Jul 20, 2012 05:05 PM
May 9, 2018 08:35 AM
May 12, 2018 10:04 PM
aQuant has contributed 28 posts out of 18,565 total posts (0.15%) in 2,190 days (0.01 posts per day).
20 Most recent posts:
Is anyone looking into this? Looking at the past (the year 2015, for example), this problem used to affect more securities, but it now seems to persist only in treasury futures.
Some Level2 updates are being transmitted with missing digits, resulting in non-tradable prices for US Treasury futures (CME Globex). Here is a snapshot of 4/30/2018 data, with times in EST (IQFeed original binary data):
119.4062 is not a tradable price; 119.40625 is (the 5th decimal digit is missing in the above update).
I am attaching a file from 4/30/2018 with the following structure:
CME ticker, timestamp in CST (hh:mm:ss.fffffff, to a tenth of a microsecond), bid, ask, depth (where depth is zero-based, i.e. 2 means an MD03 update in IQFeed terms). I scanned data from 1/1/2018 to 4/30/2018 and this happens daily. I can provide a full list of instances with times, similar to the attached file.
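For what it's worth, a truncated price like the one above can be detected mechanically, since every tradable price must sit on the contract's tick grid. A minimal sketch, assuming a 1/64-point minimum tick (as for 10-year T-note futures; other treasury contracts use different ticks):

```python
from fractions import Fraction

def on_tick_grid(price: str, tick: Fraction) -> bool:
    """True if the quoted price is an exact multiple of the minimum tick."""
    return Fraction(price) % tick == 0

# Assumed tick: half of 1/32 of a point, i.e. 1/64, as for 10-year
# T-note futures. Prices are parsed as exact fractions to avoid
# floating-point rounding masking the truncation.
TICK = Fraction(1, 64)

print(on_tick_grid("119.40625", TICK))  # True: 119 + 26/64, on the grid
print(on_tick_grid("119.4062", TICK))   # False: truncated digit, off the grid
```

A check like this over a day's file would produce the list of affected updates directly.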
Almost 2 years later, any update on this topic? Is this still on your roadmap, any closer? Thank you.
Is there any update on the above? Was it internally discussed? What is the conclusion? Thank you.
Thank you. I did send an email. I think this feature caters to an advanced audience, but it is all the more precious for those who know how to use the information. It would be very valuable to have it.
I would like to receive order count in addition to the already available price/quantity information for Level2 data as shown here:
It should be very easy to implement, and you are already getting the data from CME; it's just a matter of passing it on to users. Please let me know your thoughts. Thank you.
Actually, apologies for bothering you. I have found the link; here it is for those interested:
CME documents bundling somewhat in their MDP3 documentation, but it is not detailed enough. Could you shed light (very briefly) on how they bundle those trades in the new protocol?
Excellent, appreciate your efforts as always.
Thank you. Once you switch to that protocol, do you have plans to include the microsecond-precision timestamps provided by CME?
I would like to find out what protocol IQFeed uses to access CME's data; is it MDP3?
Note that I do mention multiple clients above, but in this case I only had one client/application using the feed.
My connection crashed today too, without any attempt to reconnect. This is the first time I have experienced IQConnect.exe completely shutting down without any warning. On a positive note, I really like your data/service overall. The reconnection/handling of multiple clients has always seemed an unstable feature of this feed, and it has caused me to lose data on a number of days over the years I have been using it. I save full L1 and L2 data, so there is no way for me to recover them from a historical download once the connection is down/not reset properly. I really wish this could be made stable; it would save a lot of trouble for the many people who use this feed for serious analytics.
Yes, I am aware of the historical data restriction. All of this is live data. Thanks, will check my data after Oct 1st.
I can check in my current data; could you specify the starting date from which I shouldn't see it? Also, does it by any chance require the latest version of IQFeed? Currently I am using 126.96.36.199.
Finally, has this, to your knowledge, been an issue that has existed for a long time, or was it just a temporary occurrence (and if so, over what date ranges)?
One thing that may be beneficial to sophisticated individuals with a limited budget (below Nanex prices) would be if Level1 updates and Level2 updates could come in a single sequence, just as they do from CME, for example. In other words, the two feeds, which are now 'independent' entities, would come in a single sequence so that synchronization is guaranteed. For sophisticated order-flow analysis one needs to see certain events in the sequence in which they happened (trades interleaved with bid/ask size and order-book updates). It is practically impossible to synchronize two independent L1 and L2 feeds even with millisecond timestamps, other than by assuming neither feed has any errors/omissions. Is an implementation of this feature conceivable?
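As a sketch of what a client-side approximation of such a merged stream looks like (with made-up events and timestamps; note that this relies on both feeds carrying comparable, error-free timestamps, which is exactly the assumption argued above to be unsafe):

```python
import heapq

# Hypothetical events: (timestamp_seconds, feed, description).
l1_events = [(0.001, "L1", "trade 97.97 x 5"),
             (0.004, "L1", "trade 97.98 x 2")]
l2_events = [(0.002, "L2", "MD01 bid 97.96 x 14"),
             (0.003, "L2", "MD01 ask 97.98 x 9")]

# heapq.merge lazily interleaves two already-sorted streams by timestamp,
# reconstructing one ordered sequence -- but only as reliably as the
# timestamps themselves.
merged = list(heapq.merge(l1_events, l2_events))
for ts, feed, event in merged:
    print(f"{ts:.3f} {feed} {event}")
```

A single exchange-ordered sequence from the vendor would make this reconstruction step unnecessary.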
I found that these happen quite a bit (especially in certain fast market conditions). Note that the third line below (a 'ba' update) shows 0 ask size at the 97.97 price. I was able to match this particular sequence with market-depth MD01 updates, and there the size shows 14. Is there an explanation for this? (This particular sequence is from August 6th, 2014, CLU4 contract.) I can provide more examples if needed.
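A minimal sketch of the cross-check used to find this kind of discrepancy, with hypothetical simplified records keyed by timestamp (the field names are my own, not IQFeed's):

```python
# One L1 'ba' update and the matching L2 MD01 top-of-book update at the
# same timestamp, reproducing the 0-vs-14 ask-size disagreement above.
l1_ba = {"13:45:02.100": {"ask_price": 97.97, "ask_size": 0}}
md01  = {"13:45:02.100": {"ask_price": 97.97, "ask_size": 14}}

# Flag records where both feeds quote the same ask price at the same
# time but disagree on the size.
mismatches = []
for ts, ba in l1_ba.items():
    depth = md01.get(ts)
    if depth and depth["ask_price"] == ba["ask_price"] \
            and depth["ask_size"] != ba["ask_size"]:
        mismatches.append((ts, ba["ask_size"], depth["ask_size"]))

print(mismatches)
```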
Great that you found the source of the error. Do you have any idea when you will release the corrected version?
Some info on my side: applications using the 4.9 protocol don't use (or subscribe to) Level2 data. Only the application running the 5.0.x protocol (latest official release) uses Level2 data.
Here are a few lines of raw Level2 data. At some point between 0 and 1am CST I was disconnected from Level2 (not sure if it was an internet outage on my side).
Here is how a correct timestamp message looks before the disconnect:
2~NÍ#Ð;T,20130521 01:00:15 (note: the part starting with "2~" and ending with ";" is my own stuff; the rest is IQFeed-generated)
Here are a few lines at and after disconnect:
After that disconnect, the connection was reestablished but since then the time stamp messages have been all gibberish:
2~‹H5TØ#Ð;TQùÝ¤pô (again note: the part starting with "2~" and ending with ";" is my own stuff; the rest is IQFeed-generated)
Also, the above line containing "BINARY_HEADER1,COMPRESSION_TYPE1" is unexpected to me.
The popups with that error have never gone away, and I also noticed that I have only about half the volume of Level2 data for today compared to most days (including yesterday), so the feed recovered somehow, but not fully.
Let me know what more specifics you would like, maybe I can dig them out.
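In case it helps with diagnosis, one specific I can offer is a client-side sanity check that flags the corrupted messages. The pattern below is inferred from the valid sample earlier in this post, not taken from the official IQFeed message spec:

```python
import re

# "T,YYYYMMDD HH:MM:SS" -- shape inferred from the valid sample
# "T,20130521 01:00:15"; an assumption, not the documented format.
TS_RE = re.compile(r"T,\d{8} \d{2}:\d{2}:\d{2}")

def looks_like_timestamp(msg: str) -> bool:
    """True if msg matches the shape of a good timestamp message."""
    return bool(TS_RE.fullmatch(msg))

print(looks_like_timestamp("T,20130521 01:00:15"))  # True
print(looks_like_timestamp("TQùÝ¤pô"))              # False: gibberish
```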
Like the other poster, I use sockets in C# to read the raw data.
Also, to add: I have run the applications (4.9 and 5.0 versions) side by side for a few weeks now without the issue.