Ceivo March 2026 — Real-Time Notifications, Smarter Tags, and ChatGPT-Ready MCP

A big month for Ceivo. March brought four releases packed with real-time notifications, a redesigned sharing experience, typed tags that understand people, brands, and locations, a dramatically richer diagnostics dashboard, and MCP connector support for ChatGPT and other third-party AI tools.

Across those four releases, we shipped a wave of features that push the platform further into three areas customers have been asking for: real-time collaboration, smarter AI metadata, and open integrations with the agent ecosystem. Here's the tour.

Real-time notifications and a reimagined sharing experience

Sharing files in Ceivo is now a proper collaboration surface, not just a link generator.

A brand-new Notifications Panel shows you activity as it happens — when files are shared with you, when your own shared files are accessed, and other updates as they flow through the platform. Pop-up alerts make sure you never miss anything, and a notification badge in the sidebar keeps an unread count right in front of you.

Sharing itself has been reworked from the ground up:

  • Share Notes — add a personal note when you share a file so recipients know exactly why you sent it.
  • Access tracking — see which recipients opened a share link and how many times they downloaded the files.
  • Guest view — the people you share with can view and download directly in the browser, no account required.
  • Email sharing — enter recipient email addresses directly in the share dialog and Ceivo sends a branded email with a link.

For teams that live in Ceivo, this turns the library from a one-way delivery mechanism into a live, trackable surface for day-to-day work.

Typed tags that actually understand your content

Ceivo's AI analysis used to return a flat list of tags. Not anymore. March's Smarter Tag Recognition update categorizes every tag by type — people, brands, locations, and more — and gives each type its own color in the UI.

That sounds cosmetic. It isn't. Typed tags mean:

  • You can search for "Coca-Cola" and get the brand, not the word in a transcript.
  • You can filter a library by location without wading through a wall of generic tags.
  • Downstream integrations and agents receive structured metadata they can actually reason over.
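
As a minimal sketch of what that structure enables downstream, here's how an integration might filter typed tags. The response shape and field names below are illustrative assumptions, not Ceivo's documented API schema:

```python
# Hypothetical typed-tag payload — field names are assumptions for illustration.
tags = [
    {"value": "Coca-Cola", "type": "brand"},
    {"value": "Atlanta", "type": "location"},
    {"value": "Jane Doe", "type": "person"},
    {"value": "coca-cola", "type": "keyword"},  # the word as it appears in a transcript
]

def tags_of_type(tags, tag_type):
    """Return only the tag values whose type matches tag_type."""
    return [t["value"] for t in tags if t["type"] == tag_type]

print(tags_of_type(tags, "brand"))     # the brand entity, not the transcript word
print(tags_of_type(tags, "location"))  # location tags only
```

With a flat, untyped list, that brand-versus-transcript distinction simply isn't recoverable.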

It's a step toward metadata that matches how media teams actually think about their content, and it unlocks a lot of the richer workflows we've been previewing at HPA and in partner demos.

MCP connector now speaks ChatGPT (and more)

One of the most important releases of the month was quiet but strategically huge: Ceivo's MCP connector now supports dynamic authentication, which means you can plug Ceivo into ChatGPT and similar third-party AI tools directly. Point an agent at your Ceivo library, let it search and reason over your content, and Ceivo's governance layer stays in charge of what it can see and do.

If you've been following our MCP story, this is the moment it becomes broadly usable outside the Ceivo-native experience.
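
To make the idea concrete, here's a toy sketch of a connector entry and a scope-based governance check. Every name in it — the endpoint URL, config fields, tool names, and scopes — is a hypothetical illustration, not Ceivo's documented configuration:

```python
# Hypothetical MCP connector entry — all fields are illustrative assumptions.
ceivo_connector = {
    "name": "ceivo",
    "transport": "http",
    "url": "https://your-ceivo-instance.example.com/mcp",  # assumed endpoint
    # Dynamic authentication: the client negotiates the auth flow at connect
    # time, so no static API key has to be baked into the config.
    "auth": {"type": "oauth", "dynamic_registration": True},
}

def allowed_tools(granted_scopes):
    """Toy governance check: expose only tools the user's scopes permit."""
    tool_scopes = {"search_library": "read", "delete_file": "admin"}
    return [tool for tool, scope in tool_scopes.items() if scope in granted_scopes]

print(allowed_tools({"read"}))  # a read-only agent never sees destructive tools
```

The point of the sketch is the shape of the arrangement: the agent asks, the connector authenticates dynamically, and the governance layer decides what is actually exposed.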

Supporting that, we also shipped AI Model Insights in the diagnostics page: admins can now see a breakdown of exactly which AI models were used for scene descriptions and embeddings across the library. Visibility, provenance, and cost control — the three things enterprise media operations need before they'll turn AI loose on their archive.

A much richer Diagnostics Dashboard for admins

For the admins running Ceivo on behalf of their organization, the Diagnostics Dashboard got a serious upgrade. It now includes dedicated tabs for:

  • Library — file and scene statistics, coverage gaps, files with missing descriptions
  • Processing — the health of analysis jobs across the platform
  • Search & AI — which models are running where, how search is performing
  • User Activity — who's doing what inside the organization

Combined with the existing ability to trigger bulk analysis on filtered content, this gives admins a single operational view of their Ceivo deployment — the kind of visibility that used to require stitching together three different dashboards.

Playlists: preview, trim, and name your clips

The playlist editor kept evolving, too. You can now:

  • Preview playlists directly in the app before rendering them, with playback controls and a segment timeline strip.
  • Trim individual segments inside the playlist with a new cut button.
  • Add rich metadata — title, description, and more — so playlists are easier to organize.
  • Create named clips straight from player selections, giving them a proper name the moment they're created.

Playlists are now a core, always-on feature for every user — no feature flag required.

Top 10 notable improvements

Beyond the headline features, here are ten smaller but noticeable improvements worth calling out:

  1. Season & Episode search — find a specific episode in your library by season and episode number.
  2. Custom file names — rename files with custom display names anywhere in the platform.
  3. Custom thumbnails — upload your own thumbnail image for any file or folder.
  4. Redesigned folder browser — a faster, cleaner folder picker for navigating and moving content.
  5. 8x playback speed — scrub through long-form content even faster in the player.
  6. Persistent search highlights — matched search terms stay highlighted as you drill into a file's details.
  7. Cleaner dashboard — playlist output files no longer clutter the main dashboard view.
  8. Copy API URL — one-click copy of the endpoint URL from the API documentation page.
  9. More consistent API responses — empty lists instead of missing fields, making integrations more reliable.
  10. Automatic database credential rotation — security hardening that happens silently in the background.
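
On item 9, a small sketch of why "empty list instead of missing field" matters for client code (the response shapes here are hypothetical, for illustration only):

```python
# Two hypothetical response shapes for a file with no tags.
inconsistent = {"name": "clip.mov"}              # "tags" omitted when empty
consistent = {"name": "clip.mov", "tags": []}    # "tags" always present

# Against the inconsistent shape, every caller needs a defensive default:
for tag in inconsistent.get("tags", []):
    print(tag)

# Against the consistent shape, plain iteration just works — and a forgotten
# default can never raise a KeyError in production:
for tag in consistent["tags"]:
    print(tag)
```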

Stability, reliability, and the quiet work

As always, the month also included a long list of stability, dependency, and reliability fixes — from smarter audio transcript handling through the API, to cleaner re-analysis (old tags are now properly replaced instead of piling up), to login styling isolation. None of these will make a highlight reel, but they're the reason the platform keeps getting faster, safer, and easier to live in every week.

What's next

April is already shaping up to be our biggest month yet — we'll be at NAB Show 2026 in Las Vegas showing off much of what shipped in March, plus the new TwelveLabs integration, live at the Signiant "Connected Intelligence" booth (Stand W2131, West Hall).

Want early access to something you saw above, or curious how a specific feature could work for your team? Reach out and we'll take it from there.

Ready to see your entire media universe?

Connect a few sources and get instant visibility — no migration needed.