Author Topic: how to insert large amounts of data (2 messages, Page 1 of 1)

d_allen
-Interested User-
Posts: 29
Joined: Nov 5, 2008


Posted: Feb 25, 2011 04:05 PM          Msg. 1 of 2
Hi,
I'm building a multithreaded app that uses producer-consumer queue logic: I push the feed data into a queue, and a worker thread processes it (inserting into and updating database tables). The problem is that towards the end of the trading session the volume gets VERY heavy and the queue grows faster than the worker thread can drain it, so the inserts lag behind by 2 or 3 minutes!

Does anyone have suggestions on how to flush the entire queue into the table in one go instead of doing individual inserts, or some other way to resolve this?

By the way, I'm using MySQL 5.5 with the .NET connector (API) and C# 4.0.

Thanks,
D.Allen
Edited by d_allen on Feb 25, 2011 at 04:06 PM
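
For illustration only (not the poster's actual code), here is a minimal C# sketch of the batching side of a producer-consumer worker like the one described; the BlockingCollection-backed queue, the Tick type, and the batch size of 1000 are all assumptions:

// Rough sketch: a worker that drains queued ticks in batches so the
// database layer gets a whole chunk at a time instead of one row per call.
// BlockingCollection, the Tick type, and the batch size are assumptions.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

class Tick { public string Symbol; public double Price; public DateTime Time; }

class BatchingWorker
{
    public BlockingCollection<Tick> Queue = new BlockingCollection<Tick>();

    public void Run()
    {
        var batch = new List<Tick>(1000);
        while (!Queue.IsCompleted)
        {
            Tick tick;
            // Wait up to 250 ms for the next item, then flush whatever we have.
            while (batch.Count < 1000 && Queue.TryTake(out tick, 250))
                batch.Add(tick);

            if (batch.Count > 0)
            {
                WriteBatch(batch); // one database round trip per batch
                batch.Clear();
            }
        }
    }

    void WriteBatch(List<Tick> batch)
    {
        // Hand the batch to the database layer; see the sketches in the reply below.
    }
}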

AMS
-Interested User-
Posts: 12
Joined: Feb 21, 2011


Posted: Feb 25, 2011 06:13 PM          Msg. 2 of 2
This is a bit generic, but when writing to a DB the only way I know of to make things faster is to use transactions and commit every 1,000 or 10,000 rows or so; that's how it's done for ETL jobs.
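
A rough C# sketch of that idea using MySQL Connector/NET (the ticks table, its columns, the commit size, and the Tick type from the sketch above are all assumptions):

// Rough sketch: commit once per N rows instead of once per row.
// Table, columns, and batch size are made up for the example.
using System.Collections.Generic;
using MySql.Data.MySqlClient;

static class BatchCommitInsert
{
    public static void Insert(MySqlConnection conn, IList<Tick> rows)
    {
        const int commitEvery = 5000;
        MySqlTransaction tx = conn.BeginTransaction();
        var cmd = new MySqlCommand(
            "INSERT INTO ticks (symbol, price, ts) VALUES (@symbol, @price, @ts)", conn, tx);
        cmd.Parameters.Add("@symbol", MySqlDbType.VarChar);
        cmd.Parameters.Add("@price", MySqlDbType.Double);
        cmd.Parameters.Add("@ts", MySqlDbType.DateTime);

        int count = 0;
        foreach (var row in rows)
        {
            cmd.Parameters["@symbol"].Value = row.Symbol;
            cmd.Parameters["@price"].Value = row.Price;
            cmd.Parameters["@ts"].Value = row.Time;
            cmd.ExecuteNonQuery();

            if (++count % commitEvery == 0)
            {
                tx.Commit();                  // flush this batch to disk
                tx = conn.BeginTransaction(); // start the next batch
                cmd.Transaction = tx;
            }
        }
        tx.Commit(); // commit whatever is left over
    }
}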

If you are using MySQL, you may also look into its multi-row INSERT syntax:
>> INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
(copied from http://dev.mysql.com/doc/refman/5.5/en/insert.html )
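
Applied to a drained batch from the queue, a C# sketch that builds one such multi-row statement (same assumed Tick type and ticks table as above):

// Rough sketch: build a single multi-row INSERT for a whole drained batch.
// Keep batches to a few thousand rows so the statement stays under
// MySQL's max_allowed_packet.
using System.Collections.Generic;
using System.Text;
using MySql.Data.MySqlClient;

static class MultiRowInsert
{
    public static void Insert(MySqlConnection conn, IList<Tick> batch)
    {
        if (batch.Count == 0) return;

        var sql = new StringBuilder("INSERT INTO ticks (symbol, price, ts) VALUES ");
        var cmd = new MySqlCommand { Connection = conn };

        for (int i = 0; i < batch.Count; i++)
        {
            if (i > 0) sql.Append(",");
            sql.AppendFormat("(@s{0}, @p{0}, @t{0})", i);
            cmd.Parameters.AddWithValue("@s" + i, batch[i].Symbol);
            cmd.Parameters.AddWithValue("@p" + i, batch[i].Price);
            cmd.Parameters.AddWithValue("@t" + i, batch[i].Time);
        }

        cmd.CommandText = sql.ToString();
        cmd.ExecuteNonQuery(); // one round trip for the whole batch
    }
}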

Also consider, if possible, dropping any primary keys, indexes, and constraints before the insert and rebuilding them afterward.
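
For example (hypothetical table and index names; only worth doing if nothing needs to query the table during the load):

// Rough sketch: drop a secondary index before a bulk load, rebuild it after.
// Table and index names are made up for the example.
using System;
using MySql.Data.MySqlClient;

static class IndexlessLoad
{
    public static void Run(MySqlConnection conn, Action bulkLoad)
    {
        new MySqlCommand("ALTER TABLE ticks DROP INDEX idx_symbol_ts", conn).ExecuteNonQuery();
        try
        {
            bulkLoad(); // run the batched inserts here
        }
        finally
        {
            new MySqlCommand(
                "ALTER TABLE ticks ADD INDEX idx_symbol_ts (symbol, ts)", conn).ExecuteNonQuery();
        }
    }
}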
 

 
