Mindtraveller Messages: 917 Registered: August 2007 Location: Russia, Moscow rgn.
Experienced Contributor
Is there a way to acquire a large record (300+ MB) without loading it into memory all at once? It would be great to have something like a stream that fetches the data in relatively small chunks.
Mindtraveller
The problem appears when one tries to store large files in, or fetch them from, a database. Without going into detail, the maximum allowed SQL transaction size is commonly no more than 1 MB.
Just in case someone else needs to store large blobs in a database, here is a solution. The idea is to have a separate table with a blob column: divide the large file into ~512 KB pieces and store them as a sequence of records in that table. Each time you need the big blob, fetch it piece by piece, across many small transactions.
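The chunking scheme above can be sketched in plain Python with sqlite3 (the original context is not Python, so treat this purely as an illustration; the table name `blob_chunks` and the two helper functions are made up for the example):

```python
import itertools
import sqlite3

CHUNK = 512 * 1024  # ~512 KB pieces, as in the post

def store_blob(con, name, path):
    """Split a large file into chunks and insert them as ordered rows."""
    con.execute("CREATE TABLE IF NOT EXISTS blob_chunks ("
                "name TEXT, seq INTEGER, data BLOB, "
                "PRIMARY KEY (name, seq))")
    with open(path, "rb") as f:
        for seq in itertools.count():
            piece = f.read(CHUNK)
            if not piece:
                break
            # one small transaction per piece keeps each statement well
            # under the ~1 MB transaction limit mentioned above
            with con:
                con.execute("INSERT INTO blob_chunks VALUES (?, ?, ?)",
                            (name, seq, piece))

def fetch_blob(con, name, out_path):
    """Reassemble the blob piece by piece, never holding it all in memory."""
    cur = con.execute(
        "SELECT data FROM blob_chunks WHERE name = ? ORDER BY seq", (name,))
    with open(out_path, "wb") as f:
        for (piece,) in cur:
            f.write(piece)
```

Only one ~512 KB piece is ever in memory at a time on either the store or the fetch path, which is the point of the technique.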
Mindtraveller wrote on Tue, 08 September 2009 17:58
The problem appears when one tries to store large files in, or fetch them from, a database. Without going into detail, the maximum allowed SQL transaction size is commonly no more than 1 MB.
Depends on DB...
Quote:
Just in case someone else needs to store large blobs in a database, here is a solution. The idea is to have a separate table with a blob column: divide the large file into ~512 KB pieces and store them as a sequence of records in that table. Each time you need the big blob, fetch it piece by piece, across many small transactions.
Well, there is support for BLOBs for Oracle. AFAIK these interfaces tend to be quite DB-specific, I am afraid, and nobody has investigated/implemented these things for other DBs. But if you do, I guess looking at how it is done (interface-wise) for Oracle should help...