
I couldn't find a tag for the apsw module.

import os
import sys

import apsw
import sqlitefts as fts

from search import OUWordTokenizer


def tokenize():
    with apsw.Connection('texts.db', flags=apsw.SQLITE_OPEN_READWRITE) as connection:
        c = connection.cursor()
        print("connection to cursor")
        # register the custom Latin tokenizer under the name 'oulatin';
        # the built-in 'porter' tokenizer needs no registration
        fts.register_tokenizer(c, 'oulatin', fts.make_tokenizer_module(OUWordTokenizer('latin')))
        print("registering tokenizer")
        # one FTS3 index per tokenizer
        c.execute("CREATE VIRTUAL TABLE IF NOT EXISTS text_idx USING fts3 (id, title, book, author, date, chapter, verse, passage, link, documentType, tokenize={})".format("oulatin"))
        c.execute("CREATE VIRTUAL TABLE IF NOT EXISTS text_idx_porter USING fts3 (id, title, book, author, date, chapter, verse, passage, link, documentType, tokenize={})".format("porter"))
        print("virtual table created")
        c.execute("COMMIT")
        c.execute("BEGIN")
        # copy the plain texts table into both full-text indexes
        c.execute("INSERT INTO text_idx(id, title, book, author, date, chapter, verse, passage, link, documentType) SELECT id, title, book, author, date, chapter, verse, passage, link, documentType FROM texts")
        c.execute("INSERT INTO text_idx_porter(id, title, book, author, date, chapter, verse, passage, link, documentType) SELECT id, title, book, author, date, chapter, verse, passage, link, documentType FROM texts")
        print("inserted data into virtual table")
        c.execute("COMMIT")

        stmt1 = 'select id, title, book, author, link from text_idx where passage MATCH "saepe commeant atque"'
        stmt2 = 'select id, title, book, author, link from text_idx_porter where passage MATCH "saepe commeant atque"'

        # materialize each result set before reusing the cursor,
        # then merge the two hit lists without duplicates
        r1 = list(c.execute(stmt1))
        r2 = list(c.execute(stmt2))
        r4 = list(set(r1) | set(r2))
        print(r4)

I am getting a segmentation fault when I run this.

Output:

bash-4.3# python3 app.py
connection to cursor
registering tokenizer
virtual table created
Segmentation fault (core dumped)

The code worked fine before, and I have made no changes since then, so I don't know why this happened. I am stumped.

UPDATE:

I have tried debugging the code with gdb, and it says this:

(gdb) run app.py
Starting program: /usr/bin/python3 app.py
connection to cursor
registering tokenizer
virtual table created
During startup program terminated with signal SIGSEGV, Segmentation fault.

Now I understand that it's not a problem with the code itself but with the shell or the wrapper. How do I approach this?
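
One thing I can try alongside gdb is Python's standard faulthandler module (in the standard library since Python 3.3), which prints the Python-level traceback when the interpreter receives SIGSEGV. A minimal sketch, independent of the code above:

import faulthandler
faulthandler.enable()  # dump a Python traceback if the process receives SIGSEGV

# Equivalently, without touching the code:
#   python3 -X faulthandler app.py
# or:
#   PYTHONFAULTHANDLER=1 python3 app.py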

UPDATE:

fts.register_tokenizer(c, 'oulatin', fts.make_tokenizer_module(OUWordTokenizer('latin')))

Is this the right way to register a user-defined tokenizer?
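
For comparison, this is roughly the pattern from the sqlitefts README, as far as I can tell (a sketch: the tokenizer class and its regex are illustrative, and tokenize is expected to yield (token, begin, end) tuples with UTF-8 byte offsets; I register on the same cursor c that my code uses):

import re
import sqlitefts as fts

class SimpleTokenizer(fts.Tokenizer):
    _p = re.compile(r'\w+', re.UNICODE)

    def tokenize(self, text):
        # yield each token together with its UTF-8 byte offsets
        for m in self._p.finditer(text):
            s, e = m.span()
            t = text[s:e]
            p = len(text[:s].encode('utf-8'))
            yield t, p, p + len(t.encode('utf-8'))

tokenizer_module = fts.make_tokenizer_module(SimpleTokenizer())
fts.register_tokenizer(c, 'simple', tokenizer_module)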

UPDATE: I have debugged the code locally with gdb, and it says this:

 (gdb) run app.py
 Starting program: /usr/bin/python3 app.py
 [Thread debugging using libthread_db enabled]
 Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
 * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
 * Restarting with stat 
 connection to cursor
 Traceback (most recent call last):
   File "app.py", line 52, in <module>
     tokenize()
   File "app.py", line 20, in tokenize
     fts.register_tokenizer(c, 'oulatin', fts.make_tokenizer_module(OUWordTokenizer('latin')))
   File "/usr/local/lib/python3.5/dist-packages/sqlitefts/tokenizer.py", line 191, in register_tokenizer
     r = c.execute('SELECT fts3_tokenizer(?, ?)', (name, address_blob))
   File "src/cursor.c", line 1019, in APSWCursor_execute.sqlite3_prepare
   File "src/statementcache.c", line 386, in sqlite3_prepare
   apsw.SQLError: SQLError: no such function: fts3_tokenizer
   [Inferior 1 (process 13615) exited with code 01]

This makes me think that the way I registered my tokenizer has some problem. Any thoughts?
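
For reference, sqlitefts installs the tokenizer through SQLite's fts3_tokenizer() SQL function, and since SQLite 3.11 the two-argument form of that function is compiled out unless the library is built with SQLITE_ENABLE_FTS3_TOKENIZER. A minimal probe of the build behind apsw (a sketch, using an in-memory database):

import apsw

c = apsw.Connection(':memory:').cursor()
# version of the SQLite library apsw is linked against
print(apsw.sqlitelibversion())
# expect ENABLE_FTS3 and ENABLE_FTS3_TOKENIZER among these
for (opt,) in c.execute("PRAGMA compile_options"):
    if 'FTS3' in opt:
        print(opt)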
