You can export to a delimited file from Access, Excel, etc., then load it directly into a Postgres table that has the matching structure (i.e. field sizes/types, number of fields, and so on). I have done this successfully with large tables that would have been a nightmare to transfer otherwise, including tables uploaded from Excel to Postgres. Some documentation is at:
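For example, assuming a comma-delimited export saved as /tmp/customers.csv and a matching table (both names are hypothetical), the load might look like this:

```sql
-- Hypothetical target table; column types must match the exported fields.
CREATE TABLE customers (
    id   integer,
    name varchar(100),
    city varchar(50)
);

-- Load the delimited export; the path must be readable by the server process.
COPY customers FROM '/tmp/customers.csv' WITH DELIMITER ',';
```

Note that COPY reads the file on the database server; if the file lives on your client machine, psql's \copy variant does the same thing client-side.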
http://www.sql.org/sql-database/post.../sql-copy.html
Here is an excerpt:
Name
COPY -- copy data between files and tables
Synopsis
COPY table [ ( column [, ...] ) ]
    FROM { 'filename' | stdin }
    [ [ WITH ]
        [ BINARY ]
        [ OIDS ]
        [ DELIMITER [ AS ] 'delimiter' ]
        [ NULL [ AS ] 'null string' ] ]

COPY table [ ( column [, ...] ) ]
    TO { 'filename' | stdout }
    [ [ WITH ]
        [ BINARY ]
        [ OIDS ]
        [ DELIMITER [ AS ] 'delimiter' ]
        [ NULL [ AS ] 'null string' ] ]
Inputs
table
The name (possibly schema-qualified) of an existing table.
column
An optional list of columns to be copied. If no column list is specified, all columns will be used.
filename
The absolute Unix path name of the input or output file.
stdin
Specifies that input comes from the client application.
stdout
Specifies that output goes to the client application.
BINARY
Changes the behavior of field formatting, forcing all data to be stored or read in binary format rather than as text. You cannot specify DELIMITER or NULL in binary mode.
OIDS
Specifies copying the internal object id (OID) for each row.
delimiter
The single character that separates fields within each row (line) of the file.
null string
The string that represents a NULL value. The default is "\N" (backslash-N). You might prefer an empty string, for example.
Note: On a copy in, any data item that matches this string will be stored as a NULL value, so you should make sure that you use the same string as you used on copy out.
Outputs
COPY
The copy completed successfully.
ERROR: reason
The copy failed for the reason stated in the error message.
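Putting the DELIMITER and NULL options together, a matched export/import pair might look like the sketch below (table and file names are hypothetical). Using the same null string in both directions is what makes NULL values round-trip correctly, per the note above; here an empty string is used as the null string:

```sql
-- Export with a pipe delimiter and an empty string for NULLs.
COPY customers TO '/tmp/customers.txt' WITH DELIMITER '|' NULL '';

-- Reload elsewhere with the SAME delimiter and null string,
-- so empty fields come back as NULL rather than as empty strings.
COPY customers FROM '/tmp/customers.txt' WITH DELIMITER '|' NULL '';
```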