#giselleai

2 posts · 1 participant · 0 posts today
Tadashi Shigeoka
Giselle now supports OpenAI’s GPT-4.1 series, enabling smarter, faster AI agents with enhanced coding, context, and instruction skills.

https://giselles.ai/blog/giselle-now-supports-openai-gpt-4-1-series

#OpenAI #GPT41 #AIAgent #GiselleAI
Tadashi Shigeoka
OpenAI’s GPT-4.1 series is here, boasting major upgrades in coding, instruction following, and a 1M token context window! 🎉

Giselle’s AI App Builder has integrated these models, empowering developers to create faster, smarter apps. Start building today!

🔗 PR: https://github.com/giselles-ai/giselle/pull/708

#GPT41 #OpenAI #LLM #GiselleAI
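Giselle’s actual integration lives in the PR above; for reference, here is a minimal sketch of calling a GPT-4.1-series model directly with the official OpenAI Node SDK. The prompt and surrounding function are illustrative only, not Giselle code.

```typescript
// Minimal sketch: calling a GPT-4.1-series model with the official OpenAI Node SDK.
// Assumes OPENAI_API_KEY is set in the environment; prompt content is illustrative.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from process.env

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4.1", // the series also includes "gpt-4.1-mini" and "gpt-4.1-nano"
    messages: [
      { role: "system", content: "You are a coding assistant." },
      { role: "user", content: "Write a TypeScript function that reverses a string." },
    ],
  });
  console.log(completion.choices[0]?.message?.content);
}

main().catch(console.error);
```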
Tadashi Shigeoka
Giselles.ai is currently developing MCP-related features!
As part of this, our team enabled supabase-mcp to handle the `SUPABASE_ACCESS_TOKEN` environment variable. Available now in v0.3.2! ⚡️

Check it out: https://github.com/supabase-community/supabase-mcp/releases/tag/v0.3.2

#Supabase #MCP #GiselleAI #OpenSource
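The release notes linked above have the specifics. Roughly, the change lets the server pick the access token up from the environment rather than requiring it to be supplied explicitly. A minimal sketch of that general pattern follows; it is illustrative only and not the actual supabase-mcp source.

```typescript
// Minimal sketch of the pattern (not the actual supabase-mcp code):
// prefer an explicitly supplied token, fall back to the SUPABASE_ACCESS_TOKEN env var.
function resolveAccessToken(explicitToken?: string): string {
  const token = explicitToken ?? process.env.SUPABASE_ACCESS_TOKEN;
  if (!token) {
    throw new Error(
      "No access token provided: pass one explicitly or set SUPABASE_ACCESS_TOKEN",
    );
  }
  return token;
}
```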
Tadashi Shigeoka
We've just enabled Vercel Fluid compute on our product, unlocking a host of benefits!

🔗 Fluid compute: https://vercel.com/fluid
🔗 Check out our PR for the details: https://github.com/giselles-ai/giselle/pull/376

In Giselle's agentic workflows, the maximum execution time has jumped from 300 to 800 seconds, allowing us to run processes on Vercel for over 13 minutes.

We’re thrilled about the upgrade and wish we had switched it on sooner.

#Vercel #FluidCompute #GiselleAI #AIAgent #AgenticWorkflow
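The linked PR shows how this was wired up in Giselle. For illustration, here is a minimal sketch of raising the per-invocation limit for a long-running job, assuming a Next.js App Router route handler deployed on Vercel; `runAgenticWorkflow` is a hypothetical placeholder, not a Giselle function.

```typescript
// Minimal sketch: raising the function timeout for a long-running job on Vercel,
// assuming a Next.js App Router route handler. With Fluid compute enabled, higher
// maxDuration values become available (the exact ceiling is plan-dependent).
export const maxDuration = 800; // seconds; this workload was previously capped at 300

export async function POST(request: Request) {
  const payload = await request.json();
  const result = await runAgenticWorkflow(payload); // hypothetical long-running job
  return Response.json(result);
}

// Hypothetical stub so the sketch is self-contained.
async function runAgenticWorkflow(payload: unknown): Promise<{ ok: boolean }> {
  return { ok: true };
}
```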