Friday, January 8, 2016

Spark SQL: Sqlite - Unparseable date

I am trying to process a database with Apache Spark, but I get this unparseable date exception. I have read all the other questions about this exception here on SO, but they don't help me. The interesting part of my code is this:

import java.sql.SQLException

try {
  // Load one table t from the SQLite database over JDBC
  val tableData = sqlContext.read.format("jdbc")
                            .options(Map("url" -> databasePath,
                                         "dbtable" -> t))
                            .load()
  // count is an action, so this is where the read actually runs
  if (tableData.count > 0) {
    val df = tableData.map(r => sqlMapFunc(r, t))
  }
} catch {
  case s: SQLException => println("DEBUG, SKIPPING TABLE " + t)
  case e: Exception    => println("EXCEPTION CAUGHT: " + t); System.exit(1)
}

So although the error is a java.sql.SQLException: Error parsing date, I cannot catch it as a SQLException; it always takes the second case statement.
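My guess is that the count action runs as a Spark job and failures come back wrapped in Spark's own exception type, so the SQLException is only the cause, not the exception I actually catch. A sketch of what I mean, walking the cause chain (findSqlException is just my own helper name):

import java.sql.SQLException
import scala.annotation.tailrec

// Search the wrapper exception's cause chain for the real SQLException.
@tailrec
def findSqlException(t: Throwable): Option[SQLException] = t match {
  case null            => None
  case s: SQLException => Some(s)
  case other           => findSqlException(other.getCause)
}

// In the catch block, decide based on the cause instead of the wrapper:
// case e: Exception =>
//   if (findSqlException(e).isDefined) println("DEBUG, SKIPPING TABLE " + t)
//   else { println("EXCEPTION CAUGHT: " + t); System.exit(1) }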

While catching the exception and just skipping the table would be a good start, I am more interested in making it actually work. But I never call Date.parse manually, so I don't know how to apply the answers from the other questions.
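One idea I want to try, based on the Spark JDBC API: register a custom JdbcDialect that reports SQLite DATE/TIMESTAMP columns to Spark as plain strings, so the driver is never asked to parse the value as a date. This is an untested sketch; SQLiteDateAsStringDialect is just my own name for it:

import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, StringType}

// Untested sketch: map SQLite DATE/TIMESTAMP columns to StringType so
// the raw text is returned instead of a parsed java.sql.Date.
object SQLiteDateAsStringDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:sqlite")
  override def getCatalystType(sqlType: Int, typeName: String,
                               size: Int, md: MetadataBuilder): Option[DataType] =
    if (sqlType == Types.DATE || sqlType == Types.TIMESTAMP) Some(StringType)
    else None
}

JdbcDialects.registerDialect(SQLiteDateAsStringDialect)
// ...then run the read.format("jdbc") code above unchanged.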

More output:

Caused by: java.text.ParseException: Unparseable date: "2009-01-01 00:00:00" does not match (\p{Nd}++)\Q-\E(\p{Nd}++)\Q-\E(\p{Nd}++)\Q \E(\p{Nd}++)\Q:\E(\p{Nd}++)\Q:\E(\p{Nd}++)\Q.\E(\p{Nd}++)
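As far as I can tell, the regex in the message corresponds to a format like yyyy-MM-dd HH:mm:ss.SSS, i.e. it insists on a fractional-seconds part that my stored value does not have. The same mismatch is reproducible with a plain SimpleDateFormat (illustration only; I don't know what the driver uses internally):

import java.text.SimpleDateFormat

// The stored value has no ".SSS" fraction, so a format that requires one
// fails with the same "Unparseable date" message:
val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS")
fmt.parse("2009-01-01 00:00:00") // throws java.text.ParseException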
