File this one under ‘geeky’ and ‘stuff that took me a while to find out so I’m sharing it with others’.
Image CC BY-SA Kevin Severud @ Flickr
During the couple of days I’ve been off work ill this week, I’ve transferred teaching.mrbelshaw.co.uk from a UK-based webhost to Bluehost (which hosts this, among other sites). It was about time my former blog (active 2005-2007) had some TLC, as it had become progressively more broken.
I had a 42MB MySQL database backup – the file that contains all of the blog’s important information (post and comment text, etc.) – but every time I tried to import this into a new database at Bluehost I kept getting timeout errors. It was then that I remembered I’d had this problem before and I’d managed to solve it with some sort of script that breaks the file up into smaller chunks to feed to the database incrementally.
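To illustrate the chunking idea: BigDump itself reads the single dump file in staged sessions rather than writing separate files, but you can get the same effect by hand with `split`. This is a hedged sketch using a stand-in dump file; the filenames are hypothetical, and line-based splitting is only safe when each SQL statement sits on its own line (as in typical mysqldump output).

```shell
# Create a stand-in dump file for illustration (in practice this would be
# your real backup, e.g. a 42MB .sql export).
printf 'INSERT INTO wp_posts VALUES (%s);\n' $(seq 1 10) > backup.sql

# Split into chunks of 4 lines each. Safe only when every statement is on
# one line; splitting mid-statement would corrupt the import.
split -l 4 backup.sql chunk_

# Each chunk_* file could then be imported separately, e.g.:
#   mysql -u user -p database_name < chunk_aa
ls chunk_*
```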
After a while searching, I came across it again. It’s called BigDump and the process, if you’re familiar with installing WordPress manually, is fairly straightforward:
- Go into phpMyAdmin and drop any existing tables in your target database (phpMyAdmin’s ‘Drop’ operation, i.e. SQL `DROP TABLE`), so the import starts from a clean slate.
- Download bigdump.zip from http://www.ozerov.de/bigdump.php and extract the zip file.
- Open bigdump.php using Notepad, TextEdit, or similar. Edit the relevant lines to point towards your database, username and password.
- Create a folder called dump on your web server and upload both bigdump.php and your MySQL database backup (the .sql file) into that folder.
- CHMOD the folder recursively to 777 (i.e. give read, write and execute permissions to everyone, including the web server process)
- Access the script via (e.g.) http://yourdomain.com/dump/bigdump.php
- Follow the instructions!
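The folder-setup steps above can be sketched locally like this. It’s a rough illustration with placeholder filenames (`backup.sql` stands in for your actual dump); in practice you’d upload the files over FTP/SFTP rather than `touch` them.

```shell
# Create the folder BigDump will run from.
mkdir -p dump

# Stand-ins for the two uploaded files (bigdump.php and your .sql backup);
# in reality these arrive via your FTP/SFTP client.
touch dump/bigdump.php dump/backup.sql

# Recursive chmod to 777, as the guide suggests: read/write/execute for
# owner, group and everyone else. (Many hosts accept 755 here instead,
# which is less risky on shared servers.)
chmod -R 777 dump

# The script would then be reached at http://yourdomain.com/dump/bigdump.php
ls -l dump
```

The 777 permission is the blunt-instrument option; if your host runs PHP as your own user, a tighter mode usually works too.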
This should lead to your database backup being successfully imported into your new database. You can then use the data in whatever web app (e.g. WordPress) that you want! 🙂