- save CQs to DB
- get conversations by joining the conversations and message_store tables
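A hedged sketch of that join, assuming a `conversations` table whose `id` matches `message_store.session_id` (`message_store` being the default table name used by LangChain's SQL chat-message history); all column names here are assumptions:

```python
import sqlite3

# Assumed schema: conversations(id, title) and a LangChain-style
# message_store(id, session_id, message). Adjust names to the real DB.
JOIN_SQL = """
SELECT c.id, c.title, m.message
FROM conversations AS c
JOIN message_store AS m ON m.session_id = c.id
ORDER BY c.id, m.id
"""

def get_conversations(conn: sqlite3.Connection) -> list[tuple]:
    """Return (conversation_id, title, message) rows across all conversations."""
    return conn.execute(JOIN_SQL).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE conversations (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE message_store (
            id INTEGER PRIMARY KEY,
            session_id INTEGER REFERENCES conversations(id),
            message TEXT
        );
        INSERT INTO conversations VALUES (1, 'ontology scoping');
        INSERT INTO message_store (session_id, message)
            VALUES (1, '{"type": "human", "content": "hi"}');
    """)
    print(get_conversations(conn))
```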
- implement interaction between the user and the LLM through LangChain for validating or revising generated CQs
- implement storing generated CQs to a database
- create upload PDF handler
- create PDF handler to extract text from PDFs and then process it using flair or something else to get the important terms
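A minimal stdlib stand-in for the term-extraction step, to be swapped for flair's NER tagger once the PDF pipeline is wired up; the stopword list and frequency scoring are assumptions:

```python
import re
from collections import Counter

# Placeholder for the flair step: rank frequent non-stopword tokens as
# candidate "important terms". Swap for flair's SequenceTagger in production.
STOPWORDS = {"the", "a", "an", "of", "and", "or", "to", "in", "is", "are", "for"}

def extract_candidate_terms(text: str, top_n: int = 10) -> list[str]:
    """Return the top_n most frequent non-stopword tokens as candidate terms."""
    tokens = re.findall(r"[A-Za-z][A-Za-z-]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [term for term, _ in counts.most_common(top_n)]
```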
- fix ChatHistories class implementation
- create web scraping handler
- create handler to extract text from web-scraped pages and then process it using flair or something else
- fix prompt for the AwanLLM domain and scope body
- create terms table to store important terms
- generate classes, object properties, and data properties out of the important terms using the LLM
- the scope in step 3 actually uses the domain within the conversation
- fuse important-term and class/instance generation into the same endpoint
- change the flow of authentication
- consider handling the logic at a higher level (e.g. in the service layer)
- refactor the code to make it more modular, readable, and maintainable
- saved CQs should be validated instantly
- add a time counter for each process in the terms extractor (PDF/URL)
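The per-process time counter can be sketched with a small context manager; the step names are illustrative:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(step: str, timings: dict):
    """Record wall-clock seconds for one step of the terms extractor."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[step] = time.perf_counter() - start

timings: dict[str, float] = {}
with timed("pdf_parse", timings):
    time.sleep(0.01)  # placeholder for the real PDF/URL parsing step
with timed("term_extraction", timings):
    time.sleep(0.01)  # placeholder for flair / LLM term extraction
print(timings)
```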
- make the terms extractor work faster because CF will only run for 100 seconds
- implement PUT and GET methods for important terms
- implement PUT method for CQs
- update generate module endpoints in the API docs
- fix issue of the PDF handler not working (file not uploaded) in production
- save object and data properties to DB
- create class and properties data junction DB command and query
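A hedged sketch of the junction schema and its query, assuming simple `classes` and `properties` tables; all table and column names are assumptions:

```python
import sqlite3

# Assumed schema: a junction table linking ontology classes to their
# (object or data) properties. Names are illustrative only.
SCHEMA = """
CREATE TABLE classes (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE properties (id INTEGER PRIMARY KEY, name TEXT, kind TEXT);
CREATE TABLE class_properties (
    class_id INTEGER REFERENCES classes(id),
    property_id INTEGER REFERENCES properties(id),
    PRIMARY KEY (class_id, property_id)
);
"""

def properties_of(conn: sqlite3.Connection, class_name: str) -> list[str]:
    """Query all property names linked to a class via the junction table."""
    rows = conn.execute(
        """SELECT p.name FROM properties p
           JOIN class_properties cp ON cp.property_id = p.id
           JOIN classes c ON c.id = cp.class_id
           WHERE c.name = ?""",
        (class_name,),
    ).fetchall()
    return [name for (name,) in rows]
```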
- add logout endpoint
- add create classes, object properties, and data properties endpoints
- add API documentation for the save class and data properties endpoints
- implement OWL file generation for the ontology system
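A minimal sketch of OWL (Turtle) serialization using plain string building; in practice a library such as rdflib or owlready2 is a better fit. The base IRI and input shapes are assumptions:

```python
# Assumed base IRI; object properties come as (name, domain, range) triples.
BASE = "http://example.org/onto#"

def to_turtle(classes: list[str],
              object_props: list[tuple[str, str, str]]) -> str:
    """Serialize class names and object properties to an OWL Turtle string."""
    lines = [
        "@prefix : <%s> ." % BASE,
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
        "",
    ]
    for c in classes:
        lines.append(":%s a owl:Class ." % c)
    for name, domain, range_ in object_props:
        lines.append(
            ":%s a owl:ObjectProperty ; rdfs:domain :%s ; rdfs:range :%s ."
            % (name, domain, range_)
        )
    return "\n".join(lines)
```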
- implement add range and domain endpoints
- fix bug in llmsherpa daemon
- (optionally) add arrays of ranges, domains, object properties, data properties, classes, and instances to the class endpoint
- create step 7 for instances: GET instances by conv_id and POST instances by conv_id (POST already uses conv_id to generate instances from)
- change implementation of step 2
- fix step 2 repeatedly generating irrelevant domain and scope for existing ontologies
- replace AwanLLM with OpenAI
- make upload PDF/URL and add new class replace existing data instead of creating new ones
- weigh whether to use UUID, ULID, or serial as the PK (or even keep serial + UUID/ULID) -> maybe watch Hussein's video
- implement logging
- restructure code (e.g. move handlers to one new specific file)
- implement error handling for API calls
- better error logs and responses per route
- secure routes using an OAuth session key
- implement user authorization and session management
- update README
- add explanation in README to install the spaCy model beforehand with this command:
  python -m spacy download en_core_web_sm
- handle text chunking for AwanLLM requests so it only applies when the text length exceeds a certain threshold
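The conditional chunking can be sketched as below; the threshold and chunk size are assumptions to be tuned against AwanLLM's actual limits:

```python
def maybe_chunk(text: str, threshold: int = 4000, chunk_size: int = 2000) -> list[str]:
    """Return [text] unchanged when it is short enough; otherwise split it
    into chunk_size pieces so each AwanLLM request stays under the limit."""
    if len(text) <= threshold:
        return [text]
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
```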
- fix issue with LLM output format
- use poetry
- implement DB migration