Acquiring large record [message #23017] Tue, 08 September 2009 00:02
Mindtraveller
Messages: 917
Registered: August 2007
Location: Russia, Moscow rgn.
Experienced Contributor

Is there a way to acquire a large record (300+ MB) without loading it into memory all at once? It would be great to have something like a stream that delivers the data in relatively small chunks.
Re: Acquiring large record [message #23027 is a reply to message #23017] Tue, 08 September 2009 23:58
Mindtraveller
Messages: 917
Registered: August 2007
Location: Russia, Moscow rgn.
Experienced Contributor

The problem appears when one tries to store large files in, or fetch them from, a database. Without going into detail, the maximum allowed size of a single SQL transaction is commonly no more than about 1 MB.
In case someone else needs to store large blobs in a database, here is the solution I use. The idea is to have a separate table with a blob column: you divide the large file into ~512 KB pieces and keep them as a sequence of records in that table. Each time you need the big blob, you fetch it with many transactions, piece by piece.
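
For reference, the chunking scheme described above looks roughly like this in U++-style C++. This is only a minimal sketch: the table BLOB_CHUNK and its columns FILE_ID, SEQ and DATA are invented names, and it assumes the parameterized Sql::Execute(sql, params...) overloads and a session that maps the blob column to String.

#include <Core/Core.h>
#include <Sql/Sql.h>

using namespace Upp;

// Hypothetical chunk table:
//   create table BLOB_CHUNK(FILE_ID integer, SEQ integer, DATA blob,
//                           primary key(FILE_ID, SEQ));

static const int CHUNK = 512 * 1024;   // ~512 KB per record

// Store a large file as a sequence of ~512 KB records.
void StoreChunked(Sql& sql, int file_id, const char *path)
{
    FileIn in(path);
    Buffer<char> buf(CHUNK);
    for(int seq = 0;; seq++) {
        int n = in.Get(~buf, CHUNK);              // read the next piece of the file
        if(n <= 0)
            break;
        sql.Execute("insert into BLOB_CHUNK(FILE_ID, SEQ, DATA) values (?, ?, ?)",
                    file_id, seq, String(~buf, n));
    }
}

// Reassemble the file piece by piece; only one chunk is held in memory at a time.
void FetchChunked(Sql& sql, int file_id, const char *path)
{
    FileOut out(path);
    sql.Execute("select DATA from BLOB_CHUNK where FILE_ID = ? order by SEQ", file_id);
    while(sql.Fetch()) {
        String chunk = sql[0];                    // one ~512 KB piece
        out.Put(~chunk, chunk.GetCount());
    }
}

Each insert or fetch touches only one ~512 KB piece, so no single statement ever approaches the size limit mentioned above.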
Re: Acquiring large record [message #23045 is a reply to message #23027] Thu, 10 September 2009 14:05
mirek
Messages: 13975
Registered: November 2005
Ultimate Member
Mindtraveller wrote on Tue, 08 September 2009 17:58

The problem appears when one tries to store large files in, or fetch them from, a database. Without going into detail, the maximum allowed size of a single SQL transaction is commonly no more than about 1 MB.



Depends on DB...

Quote:


In case someone else needs to store large blobs in a database, here is the solution I use. The idea is to have a separate table with a blob column: you divide the large file into ~512 KB pieces and keep them as a sequence of records in that table. Each time you need the big blob, you fetch it with many transactions, piece by piece.



Well, there is support for BLOBs for Oracle. AFAIK these tend to be quite DB-specific, I am afraid, and nobody has investigated or implemented these things for other DBs. But when you do, I guess looking at how it is done (interface-wise) in Oracle should help...
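
As a rough illustration of the kind of DB-specific access meant here (this is not the U++ Oracle BLOB interface itself, which is not shown in this thread): on Oracle a large BLOB can also be read piecewise from plain SQL via DBMS_LOB, so the whole value never has to be fetched at once. The table BIG_FILE and its columns are invented for the sketch, and it again assumes the parameterized Sql::Execute overloads.

#include <Core/Core.h>
#include <Sql/Sql.h>

using namespace Upp;

// Hypothetical table: create table BIG_FILE(ID integer primary key, DATA blob);
// dbms_lob.substr on a BLOB returns RAW, which is limited to 2000 bytes per call
// when used from SQL, so the pieces here are much smaller than 512 KB chunks.
void FetchOracleBlobPiecewise(Sql& sql, int id, const char *path)
{
    FileOut out(path);
    sql.Execute("select dbms_lob.getlength(DATA) from BIG_FILE where ID = ?", id);
    if(!sql.Fetch())
        return;
    int64 total = sql[0];                          // total BLOB length in bytes
    const int PIECE = 2000;                        // RAW limit of dbms_lob.substr in SQL
    for(int64 offset = 1; offset <= total; offset += PIECE) {   // LOB offsets are 1-based
        sql.Execute("select dbms_lob.substr(DATA, ?, ?) from BIG_FILE where ID = ?",
                    PIECE, offset, id);
        if(!sql.Fetch())
            break;
        String piece = sql[0];
        out.Put(~piece, piece.GetCount());
    }
}

This issues one round trip per 2000-byte piece, so it is mostly an illustration; for real workloads the chunk-table scheme above or a native BLOB interface (such as the Oracle support mentioned) will be considerably faster.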
