
CSV Export downloading zero-byte file

timespro 25 Sep, 2014
I am facing an issue while downloading database table data in CSV format: for a big database table, the CSV Export downloads a 0-byte file. Is there a limit on the size of the table that can be exported to CSV? If so, is there a workaround or any other possible solution?

Another form with 3423 entries downloads its CSV file perfectly well, but the one with 27303 entries doesn't download as expected.

Please reply as soon as possible, as this needs to be resolved urgently.
GreyHead 25 Sep, 2014
Hi timespro,

It's possible that the file you are trying to create is just too big :-(

You can custom-code a CSV export for a big table using a loop that reads the records in batches of, say, 1-2000 at a time. Or you can try using another MySQL tool to do the export; phpMyAdmin will probably handle this OK.
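If you can get at the database directly, a SELECT ... INTO OUTFILE query run from phpMyAdmin's SQL tab is one way to do it. Something like the following should work, though the table name here is made up (check your own table prefix and form name), the MySQL user needs the FILE privilege, and the file is written on the database server, not your PC:

    SELECT *
    FROM jos_chronoforms_data_myform
    INTO OUTFILE '/tmp/myform_export.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n';

Because MySQL writes the file itself, PHP's memory limit never comes into play.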

Bob
chlodwik 26 Sep, 2014
I have the same issue with a CSV file of no more than 5000 rows. I am using Joomla 2.5.24 and ChronoForms 4.0.1. Any suggestions on what can cause the CSV export to have no data?
timespro 26 Sep, 2014
Thanks for the reply, Bob. You are absolutely right that the database table is way too huge; it has around 27000-odd entries and keeps growing every day. I like your suggestion of custom-coding the CSV export to download 1-2000 entries at a time, and I would really appreciate it if you could help me achieve this, because only this can be a viable long-term solution.
For now I have increased the php_value memory_limit from 128MB to 256MB in the .htaccess file as a stopgap measure to raise the download limit, but I can safely predict that it will be exhausted soon.
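For reference, the line I added to .htaccess is the one below (note that php_value directives only take effect when PHP runs as an Apache module; on CGI/FastCGI setups the limit has to be raised in php.ini instead):

    php_value memory_limit 256M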

I would really appreciate your help with tweaking the plugin so that it downloads CSV files in chunks.
timespro 29 Sep, 2014
Please reply to the above query as soon as possible, because it's affecting my workflow.
GreyHead 29 Sep, 2014
Hi timespro,

Sorry for the delay. I've been looking to see if I can still find the code I used for this a few years back - so far no luck. I've also looked at the code for my action and, as I feared, there isn't a quick fix: the sequence of the code has to be changed quite a lot to support reading the data in chunks*. I'll try to look at this in the next couple of days.

Bob

* Basically, as it is written now, the action gets all of the data, saves it in memory, then writes the file. To handle multiple reads, the file-writing code has to be moved back into the code that gets the data. As a one-off that's not too difficult; adding it to the action means looking at multiple use cases so that the new option doesn't break the others.
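As a rough sketch of what the one-off version looks like - this is written against the Joomla 2.5 database API rather than the actual action code, and the table name, file name, and batch size are placeholders:

    <?php
    // Stream a big table as CSV in batches so the whole result set
    // is never held in memory at once.
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="myform_export.csv"');

    $db     = JFactory::getDbo();
    $out    = fopen('php://output', 'w');
    $limit  = 2000;  // rows per batch
    $offset = 0;
    $first  = true;

    do {
        // setQuery() accepts an offset and a limit, so each pass reads one batch
        $db->setQuery('SELECT * FROM #__chronoforms_data_myform', $offset, $limit);
        $rows = (array) $db->loadAssocList();

        foreach ($rows as $row) {
            if ($first) {
                fputcsv($out, array_keys($row)); // header row from the column names
                $first = false;
            }
            fputcsv($out, $row); // write each row as soon as it is read
        }
        flush();             // push this batch to the browser before the next read
        $offset += $limit;
    } while (count($rows) == $limit);

    fclose($out);
    exit;

The key change is exactly the one described above: fputcsv() is called inside the read loop, so each batch is written out and discarded before the next one is fetched.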
timespro 30 Sep, 2014
Thanks for the reply, Bob. It would be a huge help if you could figure this out, or any other long-term alternative, as soon as possible, because I predict the memory limit will be exhausted very soon.
timespro 06 Oct, 2014
Hi Bob,

Just pointing this out again: please understand the magnitude of my situation and help me with it.
The resources are going to dry up soon, and I would not like to end up in that situation.

Thanks in anticipation.