For a deeper look into our Eikon Data API, see:

Overview |  Quickstart |  Documentation |  Downloads |  Tutorials |  Articles


How to fetch large amounts of data efficiently

Hi,

I noticed that it's taking a very long time to fetch large amounts of data. For example, I'm currently trying to fetch the mid price and yield for 50,000 securities for all of 2020. It takes around a day of loading to get one month of data, and I was wondering how to do this more efficiently?

I'm not sure whether it's our usage of the Eikon Data API, or perhaps some existing inefficiencies in our code interacting with our database, that's causing this to take so long.

eikon, eikon-data-api, workspace, workspace-data-api, refinitiv-dataplatform-eikon, python

1 Answer


Hello @kevin.guo,

In terms of requesting large data sets with the Eikon Data API, you may find this previous discussion relevant for gauging what to expect in terms of performance.

I am assuming that, for the purposes of verification, you can temporarily disable the database insert path and test retrieval via the Eikon Data API separately.
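To separate the two suspects, one simple approach (a minimal sketch; the stage names `ek.get_data` usage and `insert_into_db` shown in the comments are placeholders for your own retrieval and insert code) is to time each stage independently:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# In a real session, time retrieval and insert separately, e.g.:
# data, fetch_secs = timed(ek.get_data, rics, fields)   # API retrieval only
# _, insert_secs   = timed(insert_into_db, data)        # database insert only
# Comparing fetch_secs and insert_secs shows which path dominates.
```

If the retrieval stage alone already takes most of the day, the bottleneck is the request pattern rather than the database code.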

In terms of requesting really large historical data sets quickly and asynchronously over HTTP REST in Python, you may wish to take a look at a different product: the Tick History API use and product specs.
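One common pattern for large instrument lists with the Eikon Data API is to split them into smaller batches rather than issuing one huge request. Below is a hedged sketch: the batch size of 500 and the field names are assumptions, not documented limits, and the real `ek.get_data` call is left commented out so the sketch runs standalone.

```python
# import eikon as ek          # uncomment in a real Eikon session
# ek.set_app_key("YOUR_APP_KEY")

def chunks(items, size):
    """Yield successive fixed-size slices of a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def fetch_in_batches(rics, fields, batch_size=500):
    """Fetch data for a large RIC list in smaller per-request batches."""
    frames = []
    for batch in chunks(rics, batch_size):
        # df, err = ek.get_data(batch, fields)   # real per-batch call
        df = batch                                # placeholder so sketch runs offline
        frames.append(df)
    return frames

# e.g. 50,000 RICs in batches of 500 -> 100 smaller requests,
# which can also be retried individually if one batch fails.
```

Batching also makes it easy to checkpoint progress, so a failure partway through a month does not force re-fetching everything from the start.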

