beem is now faster than ever, and batched RPC calls are possible

beem is an almost complete Python library for Steem. It is well tested, with 286 unit tests and a coverage of around 77%. The library is now in beta state; its latest version is 0.19.14.


Batch API calls on AppBase

Batch API calls are now possible with AppBase. When an API call is made with add_to_queue=True, it is not submitted but stored in rpc_queue. When a call with add_to_queue=False (the default) is made, the complete queue is sent to the node at once. The result is a list of replies.

from beem import Steem
stm = Steem("")
# Queue a call instead of sending it immediately
stm.rpc.get_config(add_to_queue=True)
# rpc_queue now contains:
# [{'method': 'condenser_api.get_config', 'jsonrpc': '2.0', 'params': [], 'id': 6}]
# This call flushes the queue; both calls are sent to the node in one batch
result = stm.rpc.get_block({"block_num": 1}, api="block", add_to_queue=False)
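For illustration, a JSON-RPC 2.0 batch request is simply a JSON array of individual request objects, sent to the node in a single round trip, and the node answers with a matching array of replies. The sketch below builds such a payload by hand (plain json, not beem's internal API); the method names and ids mirror the queue example above, and the replies shown are simulated:

```python
import json

def make_request(request_id, method, params):
    """Build a single JSON-RPC 2.0 request object."""
    return {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}

# Queue several calls, then serialize them as one batch payload.
queue = [
    make_request(6, "condenser_api.get_config", []),
    make_request(7, "block_api.get_block", {"block_num": 1}),
]
payload = json.dumps(queue)  # a JSON array -> one request, many calls

# The node replies with a JSON array; replies are matched to requests by "id".
# (Simulated replies here; a real node would return actual result data.)
replies = [{"jsonrpc": "2.0", "id": 6, "result": {}},
           {"jsonrpc": "2.0", "id": 7, "result": {}}]
by_id = {r["id"]: r["result"] for r in replies}
```

Matching by "id" matters because a server may answer a batch in any order.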

Websocket performance

The websocket performance for fetching 24 hours of blockchain data could be improved by using threads:
28790 blocks are fetched over wss:// in 18 minutes and 22 seconds.


Using that script, 24 hours of blockchain data could be processed in 11 minutes and 23 seconds. In both cases, 16 threads were used.
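beem's threaded fetching is internal to the library, but the idea can be sketched generically: split a block range across a thread pool so that several RPC round trips are in flight at once. The sketch below uses concurrent.futures with a hypothetical fetch_block() stub standing in for the actual RPC call; thread_num=16 mirrors the benchmark above:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_block(block_num):
    """Stub standing in for a network RPC call such as get_block (hypothetical)."""
    return {"block_num": block_num, "transactions": []}

def fetch_range(start, stop, thread_num=16):
    """Fetch blocks [start, stop) concurrently and return them in height order."""
    with ThreadPoolExecutor(max_workers=thread_num) as pool:
        # map() preserves input order, so results come back sorted by height
        # even though the underlying calls complete out of order.
        return list(pool.map(fetch_block, range(start, stop)))

blocks = fetch_range(1, 101)  # fetch blocks 1..100 with 16 worker threads
```

Threads help here because block fetching is I/O-bound: while one thread waits on the node, others can issue their requests.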


Bug fixes and more unit tests

  • Bug fix for account
  • Block structure changed; it is always {'block': ..., 'id': ...} now
  • block function removed; Block is used now
  • Bug fix in comment
  • Discussions_by_payout removed
  • Several bug fixes in witness
  • More exceptions added to steemnoderpc
  • Unit tests for AppBase nodes added (using parameterized)
  • Unit tests for steemnoderpc added

Further changes from the recent commits:

  • Fix unit tests and prepare release of 0.19.13
  • Improve code and try to fix unit tests for py27 and py34
  • Code improvements
  • Bug fixes and improvements
  • More unit tests
  • Websocket, graphenerpc, and Notify refactored
  • Block API changed and graphenerpc improved

Posted on - Rewarding Open Source Contributors