Hey,
I’m working on a Python + PostgreSQL system where:
- User query → LLM generates SQL
- Data is fetched from PostgreSQL
- LLM processes data (including calculations/derivations) to generate the final answer
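For concreteness, the flow you describe might be sketched like this. All names here are placeholders, not your real code: `llm` stands for whatever model call you use, `conn` for a psycopg2 connection, and the `is_safe_select` guard is an assumption I'd add, since running raw LLM-generated SQL unchecked is a common accuracy and safety hole in this setup:

```python
# Sketch of the described pipeline (hypothetical names throughout):
# user question -> LLM writes SQL -> PostgreSQL executes -> LLM summarizes rows.

def is_safe_select(sql: str) -> bool:
    """Reject anything that is not a single read-only SELECT statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # more than one statement
        return False
    lowered = stripped.lower()
    if not lowered.startswith("select"):
        return False
    forbidden = ("insert", "update", "delete", "drop", "alter", "truncate")
    return not any(f" {kw} " in f" {lowered} " for kw in forbidden)

def answer(question: str, llm, conn) -> str:
    """llm(prompt) -> str; conn is a psycopg2 connection (both injected)."""
    sql = llm(f"Write one PostgreSQL SELECT statement for: {question}")
    if not is_safe_select(sql):
        raise ValueError(f"refusing to run generated SQL: {sql!r}")
    with conn.cursor() as cur:
        cur.execute(sql)
        rows = cur.fetchall()
    # Hand the raw rows back to the model for the final calculation/summary.
    return llm(f"Question: {question}\nRows: {rows}\nAnswer with calculations:")
```

One practical note on accuracy: the more of the calculation you push into the SQL itself (aggregates, window functions) rather than into the second LLM pass, the less the model has to do arithmetic, which is where small open models tend to fail.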
Main issue: achieving high accuracy on complex, multi-parameter queries (not just simple trends), especially when the system has to combine multiple fields and perform calculations/inference at a level comparable to Gemini.
Problems:
- Slow response
- Need a free/open-source alternative to Gemini
- Want strong reasoning + calculation capability from the model
Questions:
- How can I improve accuracy and reasoning for complex, multi-parameter queries in this setup?
- Which free/open-source LLMs + architectures can match Gemini-level reasoning (including calculations and derived insights)?
Tech: Python, PostgreSQL
Any suggestions or real-world approaches would really help 🙏