We've been using a Wiki server at the office for years. It was
originally configured to use MySQL, and after 8+ years we're finally
moving the Wiki to new hardware. The Wiki software (MediaWiki) is the
only thing still tied to MySQL, which we want to decommission, but
since we've been using it for so long I'm worried about losing the
data. I've done some Googling to find out how to convert a MySQL
database dump so it can be imported into my new PostgreSQL database,
but I don't know how practical or recommended that process is. I found
sites like the following:
http://en.wikibooks.org/wiki/Convert..._to_PostgreSQL
Can you guys tell me if this is something that will work? I don't mean
that exact link specifically, just in general: can you take a database
from MySQL and successfully migrate it to PostgreSQL?
From what I can see in the MySQL database, there appear to be 43
tables with lots of column data and who knows what else:
Code:
mysql> show tables;
+----------------------+
| Tables_in_wiki       |
+----------------------+
| dp_archive           |
| dp_category          |
| dp_categorylinks     |
| dp_change_tag        |
| dp_externallinks     |
| dp_filearchive       |
| dp_hitcounter        |
| dp_image             |
| dp_imagelinks        |
| dp_interwiki         |
| dp_ipblocks          |
| dp_ipblocks_old      |
| dp_job               |
| dp_langlinks         |
| dp_logging           |
| dp_math              |
| dp_objectcache       |
| dp_oldimage          |
| dp_page              |
| dp_page_props        |
| dp_page_restrictions |
| dp_pagelinks         |
| dp_protected_titles  |
| dp_querycache        |
| dp_querycache_info   |
| dp_querycachetwo     |
| dp_recentchanges     |
| dp_redirect          |
| dp_revision          |
| dp_searchindex       |
| dp_site_stats        |
| dp_tag_summary       |
| dp_templatelinks     |
| dp_text              |
| dp_trackbacks        |
| dp_transcache        |
| dp_updatelog         |
| dp_user              |
| dp_user_groups       |
| dp_user_newtalk      |
| dp_valid_tag         |
| dp_validate          |
| dp_watchlist         |
+----------------------+
43 rows in set (0.01 sec)