develooper Front page | perl.beginners | Postings from October 2009

Re: Limit on number of columns pulled using DBI::ODBC

Thread Previous | Thread Next
From:
Tony Esposito
Date:
October 5, 2009 07:34
Subject:
Re: Limit on number of columns pulled using DBI::ODBC
Message ID:
282327.30344.qm@web24616.mail.ird.yahoo.com
Oracle does not have a LIMIT clause ... the best that can be done is to use ROWNUM, but that can get 'funky'.  Basically you are saying that I get a subset (a block) of data at a time and then repeat the process until all rows are retrieved.  Is that correct?  Do you have an Oracle example of using ROWNUM instead of LIMIT (which I believe is MySQL)?
 
thanks
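[For later readers: since Oracle (pre-12c) lacks LIMIT, the usual idiom wraps the query in subqueries and filters on the ROWNUM pseudocolumn. A sketch only; table name, key column, and row bounds are placeholders:]

```sql
-- ROWNUM is assigned as rows are fetched, before ORDER BY is applied,
-- hence the nesting: order first, number second, filter last.
SELECT *
  FROM (SELECT t.*, ROWNUM AS rn
          FROM (SELECT * FROM table_name ORDER BY id) t
         WHERE ROWNUM <= 10000)
 WHERE rn > 5000;
```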

--- On Wed, 30/9/09, Steve Bertrand <steve@ibctech.ca> wrote:


From: Steve Bertrand <steve@ibctech.ca>
Subject: Re: Limit on number of columns pulled using DBI::ODBC
To: "Tony Esposito" <tony1234567893@yahoo.co.uk>
Cc: "Beginners Perl" <beginners@perl.org>
Date: Wednesday, 30 September, 2009, 10:14 PM


Tony Esposito wrote:

Please try to fix your email application (Yahoo) to wrap lines at ~76
chars per line ;)

> I am trying to retrieve all columns and some (not all) rows from an Oracle table which contains - among other things - 2 CLOB columns, and print this to a flat file (ASCII file).  
> 
> As I "tune" $dbh->{LongReadLen} = 20000 I get varying results, from 'Out of Memory!' errors, to not all data being retrieved, to a Perl Interpreter Error window that asks me if I want to send the error contents to Microsoft.  

It seems you are setting a global variable for something that you should
limit within a smaller scope.

Again, without knowing what your statement is, it's hard to say, but if
I were running into memory issues, I'd rebuild my SQL query as such:

my $sth = $dbh->prepare("
        SELECT *
        FROM table_name
        LIMIT 5000
    ") or die $dbh->errstr;

imho, global catchalls should be used as a *very last* resort to trap
problems that will cause fatal OS errors.

By using the 'LIMIT' statement, you can get what you want, write it to a
file, de-allocate the memory, get the row id that you left off at, and
then start over again by grabbing another 'limit' number of rows and
appending that chunk to the file.
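[For later readers: in Oracle terms, where there is no LIMIT, that chunking loop might look something like the sketch below. The DSN, table name, key column `id`, and chunk size are all hypothetical placeholders, and it assumes a numeric, increasing key:]

```perl
#!/usr/bin/perl
# Sketch only: fetch 5000 rows at a time, keyed on the last id seen,
# and append each chunk to a flat file. Requires DBD::Oracle.
use strict;
use warnings;
use DBI;

# Placeholder DSN/credentials
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass')
    or die $DBI::errstr;
$dbh->{LongReadLen} = 20000;    # per-handle, not a global catchall

my $chunk   = 5000;
my $last_id = 0;

my $sth = $dbh->prepare(q{
    SELECT * FROM (
        SELECT * FROM table_name WHERE id > ? ORDER BY id
    ) WHERE ROWNUM <= ?
}) or die $dbh->errstr;

open my $fh, '>>', 'dump.txt' or die "open: $!";

while (1) {
    $sth->execute($last_id, $chunk) or die $sth->errstr;
    my $rows = $sth->fetchall_arrayref;
    last unless @$rows;
    for my $row (@$rows) {
        print {$fh} join("\t", map { defined $_ ? $_ : '' } @$row), "\n";
        $last_id = $row->[0];    # assumes id is the first column
    }
    last if @$rows < $chunk;     # final, partial chunk
}
close $fh;
$dbh->disconnect;
```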

If you need further help with doing this, at least reply with what your
auto-increment is called, and what your full statement is.

I'll help you write a function to 'chunk' things up.

There are two possible outcomes to that:

- I respond with code for you and it works or it doesn't

- someone else responds with code for you that works

and

- the code you receive in response is reviewed and criticized
heavily, and what falls out the bottom of the thread is much better and
more intriguing than you could have imagined ;)

Steve


-- 
To unsubscribe, e-mail: beginners-unsubscribe@perl.org
For additional commands, e-mail: beginners-help@perl.org
http://learn.perl.org/





