
Re: [Omaha.pm] Mapping Perl structures to a SQL table...



If your data is all hashes, perhaps what you ought to look into is KiokuDB, since it stores hashes very efficiently. If you can turn your hashes into Moose classes while you're at it, all the better.
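
Roughly this shape, say (untested sketch, assuming the DBI backend over
SQLite; "Host" is just a made-up class for illustration):

  use strict;
  use warnings;
  use KiokuDB;    # plus KiokuDB::Backend::DBI and DBD::SQLite

  # Made-up Moose class standing in for one of your hashes.
  package Host;
  use Moose;
  has name     => ( is => 'ro', isa => 'Str',     required => 1 );
  has settings => ( is => 'rw', isa => 'HashRef', default  => sub { {} } );

  package main;

  # SQLite file via the DBI backend; create => 1 sets up the schema.
  my $dir = KiokuDB->connect( "dbi:SQLite:dbname=hosts.db", create => 1 );

  {
      my $scope = $dir->new_scope;

      # store() hands back an ID you can keep...
      my $id = $dir->store(
          Host->new( name => "web01", settings => { os => "linux", cpus => 4 } )
      );

      # ...and lookup() returns the object later.
      my $host = $dir->lookup($id);
  }

The appeal is that KiokuDB handles serializing nested structures and
references between objects for you, which is most of the pain of a
hand-rolled SQL mapping.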

Cheers,
Sterling

On Fri, Oct 30, 2009 at 1:31 PM, Dan Linder <dan@linder.org> wrote:
I'm taking on the task of converting our in-house tool to use the Perl
DBI module in place of the Data::Dumper/eval() scheme it currently uses
to store and retrieve data.  It isn't pretty, but it has worked well
enough for the small data sets we've been using.
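
(For context, the current scheme is basically this -- simplified, with
a made-up file name:)

  use strict;
  use warnings;
  use Data::Dumper;

  my %record = ( name => "web01", settings => { os => "linux" } );

  # Store: dump the structure to a file as Perl source.
  open my $out, '>', 'web01.dump' or die $!;
  print {$out} Data::Dumper->Dump( [ \%record ], ['record'] );
  close $out;

  # Retrieve: slurp the file and eval() it back into a variable.
  my $record;    # the dumped code assigns to $record
  my $text = do { local $/; open my $in, '<', 'web01.dump' or die $!; <$in> };
  eval $text;
  die $@ if $@;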

We now have people commenting on the speed - some pages take 7+
minutes to come up while the back-end Perl code walks the directory
structure and eval()s the necessary files to build the page.  As I
expected, the eval() calls account for the bulk of that time...

What I'm looking for are some general comments and discussion about
the mental task of mapping these hashes into SQL tables.  I'm not
really looking for a tool, more a high-level discussion about ways to
store the data and still remain flexible.
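
(For concreteness, the sort of thing I've been sketching so far is a
generic key/value table per record - table and column names made up,
SQLite only as an example:)

  use strict;
  use warnings;
  use DBI;

  my $dbh = DBI->connect( "dbi:SQLite:dbname=tool.db", "", "",
      { RaiseError => 1 } );

  # One generic attribute table instead of a column per hash key.
  $dbh->do(q{
      CREATE TABLE IF NOT EXISTS attributes (
          record_id TEXT NOT NULL,
          attr      TEXT NOT NULL,
          value     TEXT,
          PRIMARY KEY ( record_id, attr )
      )
  });

  # Store one flat hash.
  my %host = ( os => "linux", cpus => 4 );
  my $sth  = $dbh->prepare(
      "INSERT OR REPLACE INTO attributes ( record_id, attr, value ) VALUES ( ?, ?, ? )"
  );
  $sth->execute( "web01", $_, $host{$_} ) for keys %host;

  # Read it back into a hash.
  my %copy = map { $_->[0] => $_->[1] } @{
      $dbh->selectall_arrayref(
          "SELECT attr, value FROM attributes WHERE record_id = ?",
          undef, "web01",
      )
  };

The obvious catch is that every value becomes a string and nested
hashes need either their own record_id or a serialized column, which
is exactly the flexibility trade-off I'd like to hear opinions on.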

Dan

--
******************* ***************** ************* ***********
******* ***** *** **
"Quis custodiet ipsos custodes?" (Who can watch the watchmen?) -- from
the Satires of Juvenal
"I do not fear computers, I fear the lack of them." -- Isaac Asimov (Author)
** *** ***** ******* *********** ************* *****************
*******************



--
Andrew Sterling Hanenkamp
sterling@hanenkamp.com
785.370.4454