DTN_Steve_S has contributed to 2037 posts out of 18499 total posts
(11.01%) in 4,568 days (0.45 posts per day).
20 Most recent posts:
There are a couple changes to that file that could affect the reliability of the code (I am unable to tell since the changes call functions that aren't included). However, by modifying the same example app to add a simple re-request loop that triggers when the ENDMSG is received, I was able to cycle through ~27K requests without error before I stopped it.
I'm not discounting the idea that the example code might have a bug in it that causes this. The example code is designed to simply demonstrate the communication and isn't very robust. Of course it is also possible that it could be a bug in the feed itself although that would be much less likely.
My recommendation to continue troubleshooting would be for you to examine the calls to BeginReceive and EndReceive to make sure they are always paired correctly. For every call to BeginReceive, there should be exactly one call to EndReceive, and it should happen before the next call to BeginReceive.
I would also add an extra Array.Clear on the socket buffer array before calling EndReceive just to be 100% sure that any data you are processing is new off the socket instead of somehow leftover from the previous read.
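The example app itself is C#, but the buffer-clearing idea can be sketched in Python: zero the receive buffer before each read so stale bytes from a previous read can't masquerade as fresh data. The loopback socketpair here is just a stand-in for the IQFeed connection, and the messages are illustrative.

```python
import socket

def read_messages(sock, n_reads, bufsize=64):
    """Read n_reads times, clearing the buffer before each read so no
    leftover bytes from a previous read can be mistaken for new data."""
    buf = bytearray(bufsize)
    out = []
    for _ in range(n_reads):
        # Analogous to the Array.Clear suggestion: zero the buffer first.
        buf[:] = bytes(bufsize)
        nbytes = sock.recv_into(buf)
        out.append(bytes(buf[:nbytes]))
    return out

# Simulate a feed connection with a local socketpair.
feed, client = socket.socketpair()
feed.sendall(b"S,KEYOK\n")
first = read_messages(client, 1)
feed.sendall(b"T,1\n")
second = read_messages(client, 1)
```

If leftover data ever shows up after the clear, you know the bytes really did come off the socket in that read rather than surviving from an earlier one.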
You might also want to turn on logging in IQConnect (you can do this via the diagnostics app) to see what the feed is actually sending and compare this to what your app is receiving. Unfortunately, the logging in IQConnect will certainly affect performance of the feed and will likely result in a very large text file so it could be troublesome to work with.
Edited by DTN_Steve_S on Apr 27, 2018 at 01:48 PM
Hello, this is almost certainly a case of either not properly dealing with partial messages you read off the socket or not properly clearing resources between requests.
By "partial messages", I mean that when you read from a socket into a buffer, there is no guarantee that the read will end exactly on a message break, which means you have to save off the partial message you received and prepend it to the beginning of the data from the next socket read.
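A minimal Python sketch of that save-and-prepend pattern (the quote messages here are made up for illustration; the real feed messages are newline-terminated, which is what the split relies on):

```python
def split_messages(chunk, leftover):
    """Split a raw socket chunk into complete newline-terminated messages.

    `leftover` is the partial message carried over from the previous read;
    it is prepended before splitting. Returns (messages, new_leftover).
    """
    data = leftover + chunk
    *messages, new_leftover = data.split(b"\n")
    return messages, new_leftover

# Simulate two socket reads that break a message mid-field.
left = b""
msgs1, left = split_messages(b"Q,AAPL,174.2\nQ,MS", left)
msgs2, left = split_messages(b"FT,41.7\n", left)
# msgs2 recovers the complete b"Q,MSFT,41.7" message.
```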
Henry, I sent an email to your registered email address with this information.
Feel free to reply to that email if you need additional information.
In that case, I would try splitting it up into 2 connections with 150 symbols each or 3 connections with 100 symbols each and see if the problem goes away.
Can you give an example of the dropped bars issue?
No, I wouldn't split this out to the extent of one connection per symbol. I would try to get it working with a single connection first and then only scale it out if you need to, starting with 2-4 connections and working from there.
The streaming bars can actually handle multiple intervals of the same symbol and it can also handle multiple symbols (of the same or differing intervals) on the same connection as well.
You simply need to specify a unique value in the RequestID on each of your bar watch requests (so submit 4 bar watch requests for each symbol, each with a unique RequestID).
With that said, you might run into some issues dealing with performance based on your description of what you need if you put everything on a single socket. Since it's entirely hardware dependent, I can't say for sure if you will run into problems but I just mention it so you're aware that it could come up. If you do have issues, and assuming you aren't saturating all cores on the CPU, you will need to spread your symbols across multiple connections to streaming bars. Streaming bars is multithreaded on a per client basis so each connection you make to streaming bars will be handled on its own thread. If that becomes an issue, make sure you keep all intervals of a single symbol on the same connection for maximum performance.
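One way to keep the unique RequestIDs straight is to generate them from a counter and record which symbol/interval each one maps to. Note the `BW,...` command string below is a placeholder, not the real wire format — consult the API docs for the actual bar watch syntax; only the RequestID bookkeeping is the point here.

```python
import itertools

_next_id = itertools.count(1)
requests = {}  # request_id -> (symbol, interval_seconds)

def watch_bars(symbol, interval_seconds):
    """Build a bar watch command with a unique RequestID and record it.
    The command layout here is illustrative only."""
    request_id = f"B{next(_next_id)}"
    requests[request_id] = (symbol, interval_seconds)
    return f"BW,{symbol},{interval_seconds},{request_id}"

# Four intervals of one symbol on one connection, each with its own RequestID:
cmds = [watch_bars("AAPL", i) for i in (60, 300, 900, 3600)]
```

When a bar message arrives, its RequestID field tells you which watch it belongs to via the `requests` map.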
Sorry for the delay in response here.
We calculate VWAP the same for all symbols that have it available.
The VWAP value takes into account all trades (and all trade volumes) for the current day (it resets on the first trade of the day).
A good example currently is the symbol GOOG, which, as of this post, has only 13 trades thus far today.
Price Size Type
1051.51 100 E
1051.3 99 O
1051.3 1 O
1051.3 100 E
1051.14 4 O
1051.23 74 O
1051.23 7 O
1051.23 16 O
1051.23 100 E
1050.89 3 O
1050.89 1 O
1041.16 4 O
1048.5 1 O
The current VWAP value is 1051.22
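Working that calculation through in Python with the 13 GOOG trades listed above (sum of price × size divided by total volume) reproduces the quoted value:

```python
# (price, size) pairs from the GOOG trade list above; all trade types count.
trades = [
    (1051.51, 100), (1051.30, 99), (1051.30, 1), (1051.30, 100),
    (1051.14, 4), (1051.23, 74), (1051.23, 7), (1051.23, 16),
    (1051.23, 100), (1050.89, 3), (1050.89, 1), (1041.16, 4),
    (1048.50, 1),
]

total_volume = sum(size for _, size in trades)
vwap = sum(price * size for price, size in trades) / total_volume
print(round(vwap, 2))  # 1051.22
```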
Sorry for the delay Craig. This is not available in the API.
I responded to this via email but I'm also responding here for the benefit of future readers.
IQFeed restricts all connections to the local machine due to restrictions by the exchanges prohibiting redistribution of data.
Hello, I got your messages via email. Unfortunately I don't have an answer for you; however, I will say (mostly for future readers of this post) that the format that should be adhered to for these fields is CCYYMMDD HHmmSS (with a single space between date and time). I understand you're just troubleshooting, and it's good information that the space seems to be what is corrupting your requests, but I'm not currently sure how. At this point I think we've determined it has to be something in your setup.
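For reference, producing the CCYYMMDD HHmmSS format in Python is a one-liner:

```python
from datetime import datetime

# CCYYMMDD HHmmSS, with a single space between the date and time parts.
stamp = datetime(2018, 4, 27, 13, 46, 5).strftime("%Y%m%d %H%M%S")
print(stamp)  # 20180427 134605
```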
Do you have the ability to compile/run any of the supplied example apps we distribute with the feed? If so, do they show the same issues? If not, it looks like your account is authorized for our DTN.IQ client. http://www.dtniq.com/template.cfm?navgroup=supportlist&urlcode=33&view=1
Can you test using this (open the Time & Sales and configure the request to use a date range).
Also, which version of Python are you using? And do you have any environment variables set that could be altering the run of the app? https://docs.python.org/2/using/cmdline.html#environment-variables You might try running your app with the -E parameter just to make sure that isn't the issue.
This information is not in the Nasdaq Level 2 feed that we receive. As a result, to get this information, it would require bringing in a new feed from exchange(s) which is not a small task. However, we do track requests for additional data to help us measure demand. If you would like to PM me your IQFeed loginID, I'd be happy to submit a request for this information on your behalf.
Craig, sorry for the delay in responding.
What you are showing here is expected behavior. The bid/ask quotes for these equities come from the exchange on a different feed and we merge them in-house before forwarding them to customers. Any effort to assure they are in chronological order would result in intentionally delaying at least one of the feeds while waiting for data from the other feed. Instead, we opt for the option that gets data to customers as quickly as possible. As a result, you will occasionally see discrepancies between trade timestamps and bid/ask timestamps in this scenario.
Good catch. We do not provide the interest rate within the feed. Our chains display leaves this field up to the user to populate.
Hello, sorry for the late response. We do not provide the greeks themselves within the API but there should be sufficient underlying data within the API that you can do these calculations within your application.
We also provide an option chains display within our DTN.IQ product (your account might already be authorized to use this) that shows greeks, but they aren't available programmatically.
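As an example of computing a greek from underlying data, here is a standard Black-Scholes call delta in Python. This is not DTN's method (the feed doesn't supply greeks); it's the textbook formula, with spot, strike, rate, volatility and expiry as the inputs you would source from the API and elsewhere.

```python
from math import log, sqrt
from statistics import NormalDist

def bs_call_delta(spot, strike, rate, vol, t_years):
    """Black-Scholes delta of a European call: N(d1)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t_years) / (vol * sqrt(t_years))
    return NormalDist().cdf(d1)

# Hypothetical at-the-money example: 20% vol, 5% rate, one year to expiry.
delta = bs_call_delta(spot=100.0, strike=100.0, rate=0.05, vol=0.2, t_years=1.0)
```

The other greeks (gamma, vega, theta, rho) follow from the same d1/d2 terms.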
In that case, the term derivative isn't intended to be industry specific. Instead it is indicating that the data available on that port is derived from other data available in the feed. It should work for all symbols/instruments we carry.
Thanks for the alert. These pages are now updating again. We are looking into why our monitoring for these pages failed to notice the lack of updates.
Thanks for the post. Just for posterity, the IQFeed installer (every version for the last 10+ years) should be installing these files by downloading and running the Microsoft installer for runtimes. For Linux users running IQFeed under WINE, this should still work (in our testing it has) as long as you are running the IQFeed installer under WINE as well.
If, for whatever reason, it doesn't work on Linux (or Windows for that matter), our official recommendation is to download and run the Microsoft installer manually (you can run it under WINE if on Linux). We host a copy of the installer that we build against on our website. For the current version (5.2), use the following link: https://www.iqfeed.net/vcredist_x86_2012u4.exe
If that still doesn't work, then the above fix will likely work for you (if on Linux).
Hello, we do not have listings of constituents available. As for your other question about getting a list of all symbols in a specific market, you have two options.
The first option is that we publish a text file of every active symbol in our system daily here:
You can parse that file to pull out whichever market(s) you need.
Alternatively, you can filter the symbol lookup requests in the API by security type or listed market (multiple markets can be sent space delimited) and just use a * wildcard in the search field to get a list from each market.
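For the first option, filtering the daily symbol file down to one market is a simple parse. Note the column layout and delimiter below are assumptions for illustration only (a pipe-delimited line with the listed market in the third column); check the actual file for its real format.

```python
# Hypothetical sample lines standing in for the daily symbol file.
sample = """AAPL|APPLE INC|NASDAQ
IBM|INTL BUSINESS MACHINES|NYSE
MSFT|MICROSOFT CORP|NASDAQ"""

def symbols_for_market(text, market):
    """Return the symbols whose listed-market column matches `market`."""
    out = []
    for line in text.splitlines():
        symbol, _name, listed_market = line.split("|")
        if listed_market == market:
            out.append(symbol)
    return out

nasdaq = symbols_for_market(sample, "NASDAQ")
```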
I can confirm this appears to be an issue in the servers. I haven't been able to identify exactly what is happening but I can duplicate the issue. I'll report this over to our server team for investigation.
For the current time, the workaround that seems to work consistently is to request the current day's data with a separate request.
Unfortunately we don't provide a snapshot data option at this time and if implemented, the likely lower limit would be 1s.
With that said, I took a look at the code you provided and the obvious thing that jumps out at me is the use of String.Split. This function allocates a new array and a new string object for each and every field on each and every message. When dealing with potentially thousands of messages per second (possibly 10s of thousands), this is going to be very inefficient, especially since these are temporary objects and you are immediately converting the fields to binary. In order to efficiently process the feed, you need to eliminate as many of these types of temporary variables as possible in your processing.
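The original app is C#, but the principle translates: walk the delimiters with an index search and convert each field directly, rather than materializing a throwaway array of substrings with a split call. A rough Python sketch of the idea, using a made-up quote message layout:

```python
def parse_trade(message):
    """Parse b"Q,<symbol>,<price>,<size>" without an intermediate
    split() list: locate each comma and convert the field in place.
    (Message layout here is illustrative, not the real feed format.)"""
    start = message.index(b",") + 1          # skip the message-type field
    end = message.index(b",", start)
    symbol = message[start:end]
    start = end + 1
    end = message.index(b",", start)
    price = float(message[start:end])        # float()/int() accept bytes
    size = int(message[end + 1:])
    return symbol, price, size

fields = parse_trade(b"Q,AAPL,174.25,500")
```

In a garbage-collected language the win comes from skipping the per-message array and per-field string objects that split-style parsing allocates.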
Also, make sure you are using the dynamic fieldsets feature of the feed to eliminate any fields that you aren't interested in processing. I can't tell from the code snippet if you are using this or not, but make sure you are.