
Medical Superintelligence

About this content

The provided texts introduce a "bottom-up" approach to achieve domain-specific superintelligence in Language Models (LMs), contrasting with conventional "top-down" training. This method leverages Knowledge Graphs (KGs) to teach LMs by explicitly composing simple domain concepts into complex ones. The research validates this approach in the medical field, utilizing the Unified Medical Language System (UMLS) KG to generate a curriculum of 24,000 reasoning tasks, complete with thinking traces. The resulting QwQ-Med-3 model significantly outperforms existing models on the ICD-Bench evaluation suite, particularly on challenging tasks, demonstrating improved reasoning and expertise transfer. This work suggests a pathway towards compositional Artificial General Intelligence (AGI) and potentially more energy-efficient AI systems.
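As a concrete illustration of the composition idea, the sketch below chains simple knowledge-graph relations into a multi-hop question-answer pair. It is a minimal, hypothetical example under assumed names (the toy triples, the functions sample_path and compose_question, and the question template are all illustrative) and does not reproduce the UMLS-based curriculum or the actual task-generation pipeline described in the work.

```python
# Minimal sketch (not the paper's pipeline): compose simple KG relations
# into a multi-hop reasoning task. The tiny graph below is a toy example.
import random

# Toy medical-style knowledge graph as (head, relation, tail) triples.
TRIPLES = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "may_cause", "gastric ulcer"),
    ("gastric ulcer", "is_a", "peptic ulcer disease"),
    ("headache", "symptom_of", "migraine"),
]

def adjacency(triples):
    """Index triples by head entity so paths can be walked from any node."""
    adj = {}
    for head, rel, tail in triples:
        adj.setdefault(head, []).append((rel, tail))
    return adj

def sample_path(adj, start, hops, rng):
    """Walk `hops` relations from `start`, chaining simple facts together."""
    path, node = [], start
    for _ in range(hops):
        if node not in adj:
            break
        rel, nxt = rng.choice(adj[node])
        path.append((node, rel, nxt))
        node = nxt
    return path

def compose_question(path):
    """Turn a relation chain into a (question, answer) pair via a naive template."""
    start = path[0][0]
    chain = " -> ".join(rel for _, rel, _ in path)
    answer = path[-1][2]
    question = (
        f"Starting from '{start}' and following the relations [{chain}], "
        f"which concept do you reach?"
    )
    return question, answer

if __name__ == "__main__":
    rng = random.Random(0)
    adj = adjacency(TRIPLES)
    question, answer = compose_question(sample_path(adj, "aspirin", 2, rng))
    print("Q:", question)
    print("A:", answer)
```

Longer paths over a richer graph would yield harder, multi-step questions, which mirrors (in spirit only) how a curriculum of increasingly complex reasoning tasks could be built from simple domain concepts.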
