Articles


Papers

| Paper | Authors | Published | DOI |
| --- | --- | --- | --- |
| Semantic Tension Language (STL): A Theoretical Framework for Structured and Interpretable Knowledge Representation | Wuko | Nov 2025 | 10.5281/zenodo.17585432 |

Reports

| Article | Topic | Key Finding |
| --- | --- | --- |
| Context Compression for LLMs | LLM Memory | STL achieves 1.76x better token efficiency than auto-compacted NL (up to 10x vs verbose prose) |
| STL Retrieval Robustness in Long-Context LLMs | Long-Context Retrieval | STL achieves +2.8% higher retrieval accuracy than NL across 1,507 trials; +50% on precise numerical data (N06: NL=0.50, STL=1.00) |
| STL as IR for LLM Code Generation | Vibe Coding | STL achieves 68% feature completion vs 45% for NL and 53% for STLC; the only format with zero human intervention and first-attempt compilation |
| STG as Skill Library | Agent Skills | Unifies declarative and procedural knowledge in a single graph — recall-to-action via propagation, with Hebbian reinforcement and natural deprecation |

Coming Soon

  • STL vs JSON for AI Applications — When structured semantics beats key-value pairs
  • Building Persistent AI Memory — Cross-session knowledge accumulation with STL
  • The Tension-Path Model — Why directional relations matter for knowledge representation