
In a sqlite3 database, I have a table "data" with two fields: type and path. The type field is defined as INTEGER, and I insert an NSUInteger value into it (for example 0 or 1). The problem is that, when I retrieve it, I get back a "strange" value. I don't know where I'm going wrong.
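The insert is done with a prepared statement, roughly like this (simplified sketch; typeValue stands for the NSUInteger and path for the stored string, those names are not from my actual code):

    // Simplified sketch of the insert described above.
    const char *insert_sql = "INSERT INTO data (type, path) VALUES (?, ?)";
    sqlite3_stmt *insert_statement = NULL;
    if (sqlite3_prepare_v2(database, insert_sql, -1, &insert_statement, NULL) == SQLITE_OK) {
        // typeValue is the NSUInteger stored in the INTEGER column.
        sqlite3_bind_int64(insert_statement, 1, (sqlite3_int64)typeValue);
        sqlite3_bind_text(insert_statement, 2, [path UTF8String], -1, SQLITE_TRANSIENT);
        sqlite3_step(insert_statement);
        sqlite3_finalize(insert_statement);
    }

The retrieval code is the following: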

    if (init_statement == nil) {
        const char *sql = "SELECT type,path FROM data WHERE id=?";
        if (sqlite3_prepare_v2(database, sql, -1, &init_statement, NULL) != SQLITE_OK) {
            NSAssert1(0, @"Error: failed to prepare statement with message '%s'.", sqlite3_errmsg(database));
        }
    }

    sqlite3_bind_int(init_statement, 1, primaryKey);

    if (sqlite3_step(init_statement) == SQLITE_ROW) {
        int type = (int)sqlite3_column_text(init_statement, 0);
        char *relPath = (char *)sqlite3_column_text(init_statement, 1);

        // other stuff
    }

    // Reset the statement for future reuse.
    sqlite3_reset(init_statement);
Sefran

1 Answer


SQLite only supports 64-bit signed integers, and you are assigning it an unsigned integer. Change it to NSInteger instead.
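For example, the row can be read back with the typed column accessors and the value kept in a signed NSInteger (a rough sketch based on the query in the question, not your exact code):

    // Sketch: read the INTEGER column with sqlite3_column_int64
    // instead of casting the text pointer, and keep the value signed.
    if (sqlite3_step(init_statement) == SQLITE_ROW) {
        NSInteger type = (NSInteger)sqlite3_column_int64(init_statement, 0);
        const unsigned char *relPath = sqlite3_column_text(init_statement, 1);
        NSString *path = relPath ? [NSString stringWithUTF8String:(const char *)relPath] : nil;
        // use type and path ...
    }
    sqlite3_reset(init_statement);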

Deepak Danduprolu