<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>devtake.dev — #trinity</title>
    <description>Articles tagged trinity on devtake.dev.</description>
    <link>https://devtake.dev/</link>
    <language>en-us</language>
    <item>
      <title>Arcee&apos;s Trinity-Large-Thinking is a 399B open MoE that costs 96% less than Opus</title>
      <link>https://devtake.dev/article/arcee-trinity-large-thinking-reasoning/</link>
      <guid isPermaLink="true">https://devtake.dev/article/arcee-trinity-large-thinking-reasoning/</guid>
      <description>Arcee released Trinity-Large-Thinking on April 1: a 399B-param sparse MoE with 13B active, Apache 2.0 weights, $0.88 per million output tokens, and PinchBench just behind Opus 4.6.</description>
      <pubDate>Mon, 27 Apr 2026 13:00:00 GMT</pubDate>
      <category>open-source</category>
      <category>arcee</category>
      <category>trinity</category>
      <category>llm</category>
      <category>ai-models</category>
      <category>open-weights</category>
      <category>moe</category>
      <category>reasoning</category>
      <category>apache-2-0</category>
      <author>soren-vanek</author>
    </item>
  </channel>
</rss>