Joined: Apr 2, 2013 08:04 PM
Last Post: Sep 20, 2013 10:09 AM
Last Visit: Sep 20, 2013 10:09 AM
Website: http://gamozolabs.com
Location: Virginia
Occupation:
Interests: Optimization
gamozo has contributed to 4 posts out of 21176 total posts (0.02%) in 4,015 days (0.00 posts per day).
20 Most recent posts:
Just tick-level on pretty much all of the top 20 most traded futures. I think @ES# has about 55 million data points at tick level.
I pull at tick level, as I post-process all the data and frequently use tick data for simulated fills rather than n-length bars.
It's not a huge issue, I usually just pull once and then have the files around forever. I'm just requesting the feature if it's easy to add as it would be nice here and there.
-Brandon
Yep, that's what I currently do. However, that only works with multiple symbols; it would be nice to have one symbol at full speed. The only way I could think of doing this is if I knew how many ticks there were historically — with that, I could start multiple queries at different starting points. I guess I could do it in chunks of 100k trades or so and just guess for now, but it'd be nice to know how many trades to expect.
Is there any chance we could get multi-threaded decompression in IQConnect? I'm bottlenecking on CPU rather than download bandwidth.
Currently I just pull all my symbols at once, which means I bottleneck on bandwidth, which is what I want. However, I would like to be able to pull a single symbol at full network bandwidth.
Another thing that would let me implement this on my end is a 'get number of ticks' request, so I could query how many ticks of history there are for a symbol. Once I have that number, I could start queries at trades/threads offsets and do a multi-threaded pull.
Any chance this could happen soon?
-Brandon
Edited by gamozo on Sep 19, 2013 at 05:22 PM
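The offset-based pull described above can be sketched roughly as follows. This is a hypothetical illustration, not real IQFeed API usage: `fetch_ticks()` is a stand-in for whatever per-range history request the feed would need to offer, and the "total tick count" is assumed to come from the requested (not yet existing) 'get number of ticks' query.

```python
# Sketch of a multi-threaded historical pull, assuming a hypothetical
# "total tick count" query exists. fetch_ticks(symbol, offset, count)
# is a placeholder, NOT a real IQFeed call.
from concurrent.futures import ThreadPoolExecutor

def chunk_ranges(total_ticks, n_workers):
    """Split [0, total_ticks) into contiguous (offset, count) ranges."""
    base, rem = divmod(total_ticks, n_workers)
    ranges, offset = [], 0
    for i in range(n_workers):
        count = base + (1 if i < rem else 0)  # spread the remainder
        if count:
            ranges.append((offset, count))
        offset += count
    return ranges

def parallel_pull(symbol, total_ticks, fetch_ticks, n_workers=4):
    """Issue one history query per range, then stitch results in order."""
    ranges = chunk_ranges(total_ticks, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        parts = pool.map(lambda r: fetch_ticks(symbol, *r), ranges)
    return [tick for part in parts for tick in part]
```

The "chunks of 100k trades and guess" fallback mentioned above amounts to the same thing without `total_ticks`: issue fixed-size requests at increasing offsets until one comes back short.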
I really do not like working with text-based protocols, as they're clumsy, expensive to parse, and take longer to write parsers for.
I'm wondering if there could be a mode added where data will be sent in binary fixed-width structures rather than CSVs. It pains my love for optimization to know that you guys take in binary data, convert it to CSV, and I convert it right back the second I read it.
I know it's probably something that most people do not ask for, but I can't imagine it would take too long for you guys to implement it.
-Brandon
Edited by gamozo on Apr 2, 2013 at 08:20 PM
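For what the fixed-width idea buys, here is a minimal sketch. The record layout below (u64 timestamp, f64 price, u32 size) is invented for illustration and is not IQFeed's actual wire format; the point is that each record is a constant-size `memcpy`-style decode instead of a text split.

```python
# Sketch of binary fixed-width tick records vs. CSV parsing.
# The field layout here is hypothetical, NOT IQFeed's wire format.
import struct

# Little-endian, packed: u64 timestamp (microseconds), f64 price, u32 size.
TICK = struct.Struct("<QdI")  # 20 bytes per record

def encode_tick(timestamp_us, price, size):
    """Pack one tick into its fixed-width binary form."""
    return TICK.pack(timestamp_us, price, size)

def decode_ticks(buf):
    """Walk a byte buffer in constant-size strides; no text parsing."""
    return [TICK.unpack_from(buf, off)
            for off in range(0, len(buf), TICK.size)]
```

A side benefit: fixed-width records make offset math trivial (record *n* lives at byte `n * TICK.size`), which pairs naturally with the multi-threaded ranged pulls requested in the earlier post.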