Problems with MATLAB Database Explorer and PostgreSQL (JDBC)

I have created a table in PostgreSQL with ~1500 columns. However, as soon as I try to insert data (using fastinsert) via JDBC, I get a few Java errors. I guess these indicate some kind of memory overflow in the PostgreSQL JDBC connector?
at org.postgresql.jdbc2.AbstractJdbc2Statement$BatchResultHandler.handleError(AbstractJdbc2Statement.java:2762)
at org.postgresql.core.v3.QueryExecutorImpl$ErrorTrackingResultHandler.handleError(QueryExecutorImpl.java:362)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1999)
at org.postgresql.core.v3.QueryExecutorImpl.flushIfDeadlockRisk(QueryExecutorImpl.java:1180)
at org.postgresql.core.v3.QueryExecutorImpl.sendQuery(QueryExecutorImpl.java:1201)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:412)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeBatch(AbstractJdbc2Statement.java:2929)
I am using the same connection for other database operations, where it works perfectly fine. While looking for the problem, I discovered that the operation works fine when only NaNs are inserted into the table, but as soon as there are more than ~800 numbers instead of NaN, the insert crashes. I also checked the data types in MATLAB and PostgreSQL, so there should be no reason for a crash there.
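
To illustrate, the insert is done roughly along these lines (a minimal sketch, not my actual code; connection details, table and column names are placeholders):

% Sketch of the failing workflow; 'mydb', 'mytable' etc. are placeholders
conn = database('mydb', 'user', 'password', ...
    'org.postgresql.Driver', 'jdbc:postgresql://localhost:5432/mydb');

% ~1500 placeholder column names
colnames = arrayfun(@(k) sprintf('col%d', k), 1:1500, 'UniformOutput', false);

values = nan(1, 1500);           % a row of only NaNs inserts fine
values(1:801) = rand(1, 801);    % more than ~800 real numbers makes it crash
fastinsert(conn, 'mytable', colnames, num2cell(values));
close(conn);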
The database can handle the data when I import CSV files, so I am a bit lost right now as to where to look for the problem or what I can do.
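
Since the CSV import works, one workaround I could try is writing the numeric data to a temporary CSV file and loading it server-side with COPY. A sketch, assuming the PostgreSQL server runs on the same machine and is allowed to read the file (COPY ... FROM needs server-side file access):

% Sketch: bypass the JDBC batch insert by going through a CSV file
tmpFile = [tempname '.csv'];
dlmwrite(tmpFile, values, 'precision', '%.15g');  % NaN is written as "NaN",
                                                  % which PostgreSQL accepts
                                                  % for float columns
curs = exec(conn, sprintf('COPY mytable FROM ''%s'' WITH (FORMAT csv)', tmpFile));
close(curs);
delete(tmpFile);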

Answers (0)
