An Agent-first Web Service Platform based on Open Source

Eyevinn Open Source Cloud (OSC) makes it easier to use open source in solutions and to quickly turn ideas into working implementations. It works with AI assistants and development environments to help you leverage open source effectively. The OSC Architect helps you design solutions by combining a knowledge base of open web services with large language models, and its IDE integration assists you with developing the connections between the open web services.

Eyevinn Open Source Cloud (OSC) not only lowers the barrier to using open source in a solution, it also lowers the barrier to going from idea to a fully working solution. With OSC integrated into your AI assistant and AI-powered development environment (IDE), they can help you utilize the full power of open source when developing your solution.

By combining a knowledge base of the available open web services and their capabilities with large language models, the OSC Architect supports you in designing a solution. With the OSC Architect integrated in the IDE, it can also assist you with developing the “glue” that connects the open web services in the solution.

A Model Context Protocol (MCP) remote endpoint is available that provides OSC context and the ability to administer your running services in OSC from your favorite AI application (or IDE, for that matter).

Some examples of what the OSC Architect can assist you with are:

  • Suggest code to set up a NoSQL database running in the cloud based on an open web service in OSC, and the code to manage the documents in the database (create, read, update and delete); see the sketch after this list.
  • Suggest code for uploading and handling large files in a bucket on an open storage service in OSC.
  • Help you design a solution for handling user registration and authentication.
  • Guide you in building a solution for automatically synchronizing a RAG vector store with new and updated documents.
  • Develop a solution for preparing and creating video files for VOD streaming, including automatic subtitle generation.
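
As an illustration of the first item, here is a minimal sketch of what such suggested code could look like, using the nano CouchDB client against a CouchDB open web service instance in OSC. The instance URL, credentials and database name are placeholders for your own instance.

import nano from 'nano';

// Placeholder URL and credentials for a CouchDB instance running as an open web service in OSC
const couch = nano('https://admin:password@mytenant-mydb.couchdb.auto.prod.osaas.io');

async function main() {
  // Create a database (ignore the error if it already exists)
  await couch.db.create('articles').catch(() => undefined);
  const db = couch.db.use('articles');

  // Create
  await db.insert({ title: 'Hello OSC', body: 'First document' }, 'article-1');

  // Read
  const doc = await db.get('article-1');

  // Update (CouchDB requires the current revision, which is included in the read document)
  await db.insert({ ...doc, body: 'Updated document' });

  // Delete
  const latest = await db.get('article-1');
  await db.destroy(latest._id, latest._rev);
}

main().catch(console.error);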

To integrate OSC in your AI application using the Model Context Protocol, add the following MCP server to your application’s MCP configuration.

{
  "mcpServers": {
    "remote-mcp-osc": {
      "command": "npx",
      "args": ["-y", "@osaas/client-mcp"],
      "env": {
        "OSC_ACCESS_TOKEN": "[osc-access-token]"
      }
    }
  }
}

You can find the OSC access token in the Open Source Cloud web console under Settings / API.

To add the OSC Architect to GitHub Copilot chat in Visual Studio Code, install the OSC Architect Chat Extension.

With OSC at the center of your development environment, you can fully utilize the power of open source and build a solution free from vendor lock-in.

Open Source Cloud architect under the hood

In this blog post we will take a look under the hood of Open Source Cloud (OSC) and, more specifically, at how our OSC architect is built. The purpose of the OSC architect is to guide and help business developers, solution architects and developers to build solutions based on open web services. Solutions based on open source offer flexibility and freedom from vendor lock-in.

The OSC architect is available in the web console or as a GitHub Copilot chat participant in Visual Studio Code.

Retrieval-Augmented Generation

Our OSC architect is built on the latest GPT large language models, and to produce answers that are more accurate for an OSC user we use Retrieval-Augmented Generation (RAG) as the AI architecture. When a question is asked, the system first searches a database containing knowledge of OSC (guides, SDK documentation, blog posts, etc). The retrieved information is then added to the context or prompt before it is passed to the language model, and the GPT model generates a response based on both the original query and the retrieved context.
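
In simplified terms the flow looks something like the sketch below. The searchKnowledgeBase function is a placeholder for the vector store lookup, and the model name and prompt wording are illustrative rather than what the OSC architect actually uses.

import OpenAI from 'openai';

const openai = new OpenAI();

// Placeholder: look up the most relevant OSC documents for the question in the vector store
async function searchKnowledgeBase(question: string): Promise<string[]> {
  // ...vector similarity search against guides, SDK documentation and blog posts...
  return [];
}

export async function answer(question: string): Promise<string> {
  // 1. Retrieve relevant OSC knowledge for the question
  const documents = await searchKnowledgeBase(question);

  // 2. Augment the prompt with the retrieved context
  const context = documents.join('\n---\n');

  // 3. Generate the response from both the question and the retrieved context
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [
      { role: 'system', content: 'You are the OSC architect. Answer using the provided OSC documentation.' },
      { role: 'user', content: `Context:\n${context}\n\nQuestion: ${question}` }
    ]
  });
  return completion.choices[0].message.content ?? '';
}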

Continuously updating the knowledge database

The knowledge database is a vector store of uploaded documents: for example, markdown files from the documentation, HTML files from the SDK reference, and blog posts. To keep the database up to date with the latest and most relevant information we need an automatic way to update the vector store.

The document files from the various sources are uploaded to an S3 bucket, and we then use an open web service in OSC to synchronize the contents of the bucket with the files in the vector store. If a file is not yet in the store it is uploaded, and if a file in the store is no longer in the S3 bucket it is removed from the vector store.
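
A minimal sketch of that synchronization logic could look like the following, using @aws-sdk/client-s3 to list the bucket and a placeholder VectorStore interface for the vector store operations (the real service has its own implementation).

import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

// Placeholder interface for the vector store operations
interface VectorStore {
  listFiles(): Promise<string[]>;
  uploadFile(key: string): Promise<void>;
  removeFile(key: string): Promise<void>;
}

export async function syncBucketWithVectorStore(bucket: string, store: VectorStore) {
  const s3 = new S3Client({});

  // Keys currently in the S3 bucket (pagination omitted for brevity)
  const listed = await s3.send(new ListObjectsV2Command({ Bucket: bucket }));
  const bucketKeys = new Set((listed.Contents ?? []).map((obj) => obj.Key!));

  // Files currently in the vector store
  const storeKeys = new Set(await store.listFiles());

  // Upload files that are in the bucket but not yet in the store
  for (const key of bucketKeys) {
    if (!storeKeys.has(key)) await store.uploadFile(key);
  }

  // Remove files from the store that are no longer in the bucket
  for (const key of storeKeys) {
    if (!bucketKeys.has(key)) await store.removeFile(key);
  }
}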

This synchronization is triggered by a GitHub Actions workflow that is executed, for example, when the documentation site is updated. Once the vector store has been synchronized with the S3 buckets, the knowledge base for the RAG has been updated.

This open web service and concept can be applied to other contexts where you want to augment the responses from a language model with domain-specific knowledge. To get help setting this up you can read the user manual for the “S3 sync vector store” open web service or ask the OSC architect for help.

OSC architect in your VS Code IDE

Based on the above, we offer a chat extension to Visual Studio Code where the developer can get help from the OSC architect when developing solutions. Install the extension and let the Copilot participant called “@osc” assist you in building solutions based on open web services. In this example I asked the OSC architect for help in storing Common Access Tokens in a database.

In addition to the augmented knowledge base, this extension prompts the OSC architect to enhance the response with code examples, as that is relevant in the context of a developer environment.

We are continuously improving the OSC architect as we augment the model with a larger knowledge base, and on the roadmap is support for remote MCP (Model Context Protocol) to enable seamless integration with AI agents and AI chat applications. So stay tuned for more!

Stronger Independence with Open Web Services

Do you feel it is time to start considering how to be less dependent on one single provider of cloud web services for your solution but don’t know how or where to start?

In this article we give you a starting point by describing how to move the message queue service in your solution to a service based on open source instead. With a service based on open source you can, at a later stage, move this to your own infrastructure.

Open source as a service

The SmoothMQ project is a message queue that is a drop-in replacement for SQS. It is open source, and there is nothing preventing you from hosting it in your own private or public cloud infrastructure. To reduce the barrier to move to a solution based on SmoothMQ, and open source in general, we developed Eyevinn Open Source Cloud: a platform where open source projects are made available as a service, an open web service, and SmoothMQ is one of them.

This makes it possible for you to start shifting your messaging over to open source message queues without having to build up a self-hosting infrastructure for it first.

Feasibility study with open web service

Start with a practical feasibility study in the form of a proof of concept by taking part of the workload that your current web service handles and placing it on a SmoothMQ open web service. Develop an adapter that consumes messages from your current web service queue and places them on a SmoothMQ instance in Eyevinn Open Source Cloud. This adapter can be deployed and run as a Web Runner. Then have some of your workers consume work from the SmoothMQ message queue. This gives you the opportunity to validate, identify gaps and estimate the effort of making the move without having to invest time and money in building up your own infrastructure first.
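
Since SmoothMQ is SQS-compatible, such an adapter can be sketched with the AWS SQS SDK alone. The sketch below is a minimal illustration; the queue URLs, the SmoothMQ endpoint and the credentials are placeholders for your current queue and your SmoothMQ instance in OSC.

import {
  SQSClient,
  ReceiveMessageCommand,
  SendMessageCommand,
  DeleteMessageCommand
} from '@aws-sdk/client-sqs';

// Your current SQS queue (credentials picked up from the environment)
const source = new SQSClient({ region: 'eu-north-1' });
const sourceQueueUrl = 'https://sqs.eu-north-1.amazonaws.com/123456789012/my-queue';

// SmoothMQ instance in OSC, same SQS API (placeholder endpoint and credentials)
const target = new SQSClient({
  region: 'us-east-1',
  endpoint: 'https://mytenant-poc.smoothmq.auto.prod.osaas.io',
  credentials: { accessKeyId: 'placeholder', secretAccessKey: 'placeholder' }
});
const targetQueueUrl = 'https://mytenant-poc.smoothmq.auto.prod.osaas.io/queue/my-queue';

// Forward a batch of messages from the current queue to SmoothMQ
export async function forwardOnce() {
  const { Messages } = await source.send(
    new ReceiveMessageCommand({ QueueUrl: sourceQueueUrl, MaxNumberOfMessages: 10, WaitTimeSeconds: 10 })
  );
  for (const message of Messages ?? []) {
    await target.send(new SendMessageCommand({ QueueUrl: targetQueueUrl, MessageBody: message.Body! }));
    await source.send(
      new DeleteMessageCommand({ QueueUrl: sourceQueueUrl, ReceiptHandle: message.ReceiptHandle! })
    );
  }
}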

By now you should have the necessary information to scope and initiate a transition project.

The transition project can use the same approach by off-loading some of the workload from your current web service and shifting it over to open web services. Gradually make the move with the comfort of knowing that you can always fall back to your current web service if you discover any problems along the way. Also, you have not had to make any large up-front infrastructure investments before you know that everything works.

Move from open web service to self-hosted infrastructure

When you are comfortable that the open source based solution works you can start the project to build up your own infrastructure for this. As the open web service is open source and not bound to either Eyevinn Open Source Cloud or its underlying infrastructure you can run the very same software in your infrastructure. What provider of cloud (or on-prem) infrastructure you choose is fully up to you.

No major modifications to the open web service based solution should be necessary, as it is the very same software running.

Stronger independence

Now you are in a stronger position, as the message queue part of your solution is no longer bound to one single vendor. To further strengthen your independence, you can take on the next component of your solution for which an open source equivalent exists and use the same approach.

Build your own platform for HLS live stream monitoring

Here is another full example of a solution built on open web services in Eyevinn Open Source Cloud. This example covers a solution for building your own platform for HLS live stream monitoring. The solution consists of an HLS Stream Monitor that monitors one or many HLS live streams for errors. It provides an OpenMetrics endpoint that Prometheus can scrape, and the metrics can be visualized in Grafana. To manage which streams to monitor we have a database and a service to create or remove streams from the stream monitor.

Requires 3 available services in your plan. If you have no available services in your plan you can purchase each service individually or upgrade your plan.

HLS Stream Monitor

The open web service that is responsible for monitoring the live streams is the HLS Stream Monitor. It provides an API to manage running monitors, and a monitor can check one or many HLS streams. It also provides an OpenMetrics endpoint that can be scraped by metrics collectors such as Prometheus.

To enable access to the monitor instance from outside of Eyevinn Open Source Cloud we launch a Basic Auth adapter running in a Web Runner. This provides Basic Auth authentication for accessing the instance and its metrics endpoint.
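
A minimal sketch of such an adapter could look like the code below, using Express together with http-proxy-middleware and a single user taken from environment variables. The monitor instance URL is a placeholder for your own HLS Stream Monitor instance.

import express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();

// Simple Basic Auth check before proxying requests to the monitor instance
app.use((req, res, next) => {
  const header = req.headers.authorization ?? '';
  const [user, pass] = Buffer.from(header.replace('Basic ', ''), 'base64').toString().split(':');
  if (user === process.env.BASIC_AUTH_USER && pass === process.env.BASIC_AUTH_PASS) {
    return next();
  }
  res.set('WWW-Authenticate', 'Basic realm="monitor"').status(401).send('Unauthorized');
});

// Forward all authenticated requests to the HLS Stream Monitor instance (placeholder URL)
app.use(
  createProxyMiddleware({
    target: 'https://mytenant-monitor.eyevinn-hls-monitor.auto.prod.osaas.io',
    changeOrigin: true
  })
);

app.listen(process.env.PORT ?? 8080);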

Stream Monitor Manager

To manage which streams to monitor and to control the HLS Stream Monitor, we have an application running in a Web Runner that reads the list of streams to monitor from a CouchDB NoSQL database.
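
A minimal sketch of such a manager could look like the following. The CouchDB database name, the monitor URL and, in particular, the /monitor endpoint and payload shape are assumptions for illustration; consult the HLS Stream Monitor API documentation for the exact contract.

import nano from 'nano';

const couch = nano(process.env.COUCHDB_URL ?? 'https://admin:password@mytenant-streams.couchdb.auto.prod.osaas.io');
const monitorUrl = process.env.MONITOR_URL ?? 'https://mytenant-monitor.eyevinn-hls-monitor.auto.prod.osaas.io';

interface StreamDoc {
  _id: string;
  url: string;
}

// Read the list of streams from CouchDB and register them with the HLS Stream Monitor
export async function syncMonitors() {
  const db = couch.db.use<StreamDoc>('streams');
  const result = await db.list({ include_docs: true });
  const streams = result.rows.map((row) => row.doc!.url);

  // Hypothetical endpoint and payload, shown only to illustrate the shape of the manager
  await fetch(`${monitorUrl}/monitor`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ streams })
  });
}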

Start building here

Build your own platform for virtual channels

Here is a full example project to get you started with building your own platform for virtual channels based on open web services in Eyevinn Open Source Cloud. This solution consists of a virtual channel playout and a simple web application fetching configuration from an application configuration service.

Requires 5 available services in your plan. If you have no available services in your plan you can purchase each service individually or upgrade your plan.

Virtual Channel Playout

The virtual channel playout is built with the open web services:

  • FAST Channel Engine, generating the live streaming manifest and providing it to the player.
  • Web Runner to provide the webhook that the engine calls to decide what to play next in the channel (see the sketch after this list).
  • CouchDB for storing the database of assets and URLs to the VOD streaming packages.
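
A minimal sketch of such a webhook is shown below, here with Express and CouchDB. The request path and the response fields (id, title, hlsUrl) are assumptions about what the engine expects; check the FAST Channel Engine documentation for the exact webhook contract. The CouchDB URL and database name are placeholders.

import express from 'express';
import nano from 'nano';

const app = express();
const couch = nano(process.env.COUCHDB_URL ?? 'https://admin:password@mytenant-assets.couchdb.auto.prod.osaas.io');
const db = couch.db.use('assets');

// Called by the FAST Channel Engine to decide what to play next on the channel
app.get('/nextVod', async (req, res) => {
  // Pick the next asset from the CouchDB database (here: simply a random one)
  const result = await db.list({ include_docs: true });
  const assets = result.rows.map((row) => row.doc as { _id: string; title: string; url: string });
  const next = assets[Math.floor(Math.random() * assets.length)];

  res.json({ id: next._id, title: next.title, hlsUrl: next.url });
});

app.listen(process.env.PORT ?? 8080);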

Web Video Application

The web video application is a NextJS based web application that reads the channel configuration from an application configuration service and provides the player to view the channel.

Start building here

Build your own Video Streaming Platform

Here is a full example project to get you started with building your own video streaming platform based on open web services in Eyevinn Open Source Cloud. This solution consists of a VOD preparation pipeline, orchestrator, database and a simple web application.

Requires 7 available services in your plan. If you have no available services in your plan you can purchase each service individually or upgrade your plan.

VOD Preparation Pipeline

The VOD preparation pipeline is built with the open web services:

  • SVT Encore for transcoding the source video file to a bundle of video files with different resolutions and qualities, often referred to as ABR transcoding.
  • Encore Packager to create a streaming package that is adapted for video delivery over HTTP.
  • MinIO providing the storage buckets that are needed.

Orchestrator

The orchestrator consumes events from the input bucket and creates a VOD preparation job when a new file is added. It is a NodeJS server application that we will develop and deploy in a Web Runner instance. The orchestrator registers all files that have been processed in a database.
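
A minimal sketch of the orchestrator logic could look like the code below. How the bucket events are delivered and the exact Encore job payload are placeholders here, so treat this as the shape of the solution rather than a working integration; the CouchDB URL and database name are also assumptions.

import nano from 'nano';

const couch = nano(process.env.COUCHDB_URL ?? 'https://admin:password@mytenant-vod.couchdb.auto.prod.osaas.io');
const db = couch.db.use('processed-files');

// Placeholder: submit a VOD preparation job to SVT Encore for the new file,
// e.g. a POST to the Encore instance API with input uri, profile and output folder
async function createVodJob(fileUri: string): Promise<void> {
  // ...
}

// Called for every "new file" event from the input bucket (event delivery mechanism is a placeholder)
export async function onNewFile(fileUri: string) {
  // Skip files we have already processed
  const id = encodeURIComponent(fileUri);
  const already = await db.get(id).catch(() => undefined);
  if (already) return;

  await createVodJob(fileUri);

  // Register the file as processed in the database
  await db.insert({ _id: id, uri: fileUri, processedAt: new Date().toISOString() });
}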

Web Video Application

The web video application is a NextJS based web application that will fetch the available files from the database and enable playback using a web video player.
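
For the playback part, a minimal sketch using the hls.js library could look like the component below. The stream URL passed as a property is whatever the application has read from the database; this is an illustrative sketch, not the exact component used in the example project.

'use client'; // client component in a NextJS app

import { useEffect, useRef } from 'react';
import Hls from 'hls.js';

// Minimal player component using hls.js to play an HLS stream URL from the database
export function Player({ src }: { src: string }) {
  const videoRef = useRef<HTMLVideoElement>(null);

  useEffect(() => {
    const video = videoRef.current;
    if (!video) return;
    if (Hls.isSupported()) {
      const hls = new Hls();
      hls.loadSource(src);
      hls.attachMedia(video);
      return () => hls.destroy();
    }
    // Safari supports HLS natively
    video.src = src;
  }, [src]);

  return <video ref={videoRef} controls autoPlay />;
}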

Start building here

Open Source Intercom Solution

In modern TV production, effective communication is crucial. Legacy intercom systems have often been complex, costly, and proprietary, limiting flexibility and scalability. The Open Source Intercom Solution, developed in cooperation with major Nordic broadcasters, leverages WebRTC technology to provide a modern, scalable, and easy-to-use alternative.

This guide will walk you through setting up and using the latest release of the Open Source Intercom Solution, ensuring seamless communication in your production environment. Whether you’re a seasoned broadcast professional or new to live production, this guide provides all the details you need to deploy and operate the system efficiently.

Key Features of the New Release

The latest update brings significant improvements, ensuring a more streamlined and user-friendly experience.

Some of the key enhancements include:

• A completely reworked and improved user interface, designed to be more intuitive and easier to navigate.

• Support for multiple calls in one browser window, enabling users to manage multiple communication lines more efficiently without switching between windows.

• Device switching during a call, allowing seamless transitions between different audio input and output devices.

• Volume control visibility on supported devices, ensuring that users can adjust audio levels more precisely.

• New participant muting functionality, allowing users to mute other participants when necessary to maintain clear communication.

• Audio functionality, where program output audio is automatically lowered when another participant speaks on a different line, enhancing clarity in live production settings.

• Numerous bug fixes and general performance improvements, ensuring a smoother, more reliable experience.

These features collectively enhance usability, making the intercom solution a powerful tool for broadcast communication.

Prerequisites

Before getting started, ensure you have the following in place:

• An Open Source Cloud account (https://app.osaas.io/), which provides access to the hosted intercom service.

• An Intercom service instance created, enabling communication channels to be established.

If you don’t have an account yet, signing up on https://app.osaas.io/ is quick and simple.

Logging In & Opening the Intercom App

To begin using the Open Source Intercom Solution, follow these steps:

• Log into https://app.osaas.io/.

• Navigate to the Intercom service in your dashboard.

• Click the three dots next to your running service and select Open Application to launch the intercom interface.

Once opened, the system is ready for production setup and communication.

Setting Up a Production

To configure a new production, follow these steps:

• In the Create Production section, enter a Production Name.

• Specify a Line Name (both fields are mandatory for creating a new production).

• If required, click Add Line to include multiple communication lines for different teams.

• Click Create Production to generate a unique Production ID.

Your new production will now be added to the list of recent productions.

To share the production with others, join the production and copy the URL from your browser’s address bar.

Joining a Production

Once a production is set up, users can join by following these steps:

• In the Join Production section, enter your Production ID and your Name.

• Choose the appropriate Microphone and Speaker settings.

• If your desired device does not appear, refresh the browser page.

• iPhone users may need to reload the page to detect the correct microphone and speaker.

• Select the Production Line you wish to join and click Join.

After joining, you will have full access to the intercom communication features.

Using the Intercom System

Once inside the system, users can leverage the following controls:

• Mute/unmute audio, allowing participants to manage their audio input effectively.

• Mute/unmute microphone, giving users control over their speaking permissions.

• Push-to-Talk (PTT) button, which activates the microphone only while pressed, ensuring clear and concise communication.

• Hotkeys support, allowing integration with external hardware like StreamDeck for more streamlined operations.

These features ensure a seamless and efficient communication experience for production teams.

Why Choose the Open Source Intercom Solution?

    A Flexible, Open-Source Alternative to Proprietary Systems

Traditional intercom systems often come with high costs, proprietary limitations, and complex infrastructure requirements. The Open Source Intercom Solution eliminates these barriers, offering a more affordable, adaptable, and high-performance alternative.

    Designed for Simplicity & Scalability

• Easy deployment: The system is available as a containerized package, meaning it can be deployed on standard IT infrastructure or hosted remotely in the cloud.

• Compatible with any retail hardware, so you’re not tied to a specific vendor’s equipment.

• Quick setup, enabling users to configure productions in a matter of seconds.

• High-fidelity audio powered by the Opus codec, ensuring professional-grade sound quality.

    Powered by WebRTC for Real-Time Communication

The system leverages WebRTC technology, which is widely adopted for real-time communication applications, offering low latency and cross-platform support. The architecture consists of:

• Symphony Media Bridge (SMB) – The core Selective Forwarding Unit (SFU) that routes media streams efficiently between participants.

• Intercom Manager – Handles backend communication control and session management.

• Intercom Frontend Manager – Provides an intuitive, web-based user interface.

By using modern, web-based technologies, this solution ensures reliable performance, ease of use, and broad compatibility with existing workflows.

Conclusion

The Open Source Intercom Solution is a forward-thinking approach to broadcast communication. With its user-friendly design, open-source flexibility, and superior audio performance, it offers a scalable and cost-effective alternative to traditional intercom systems. Whether you’re a broadcaster looking to modernize your communication infrastructure or a production team in need of an intuitive and efficient intercom system, this solution provides a seamless way to stay connected.

Get started today by logging in to https://app.osaas.io/ and exploring the possibilities of modern, open-source intercom technology!

Instantly share your web application

In web development, prototyping and the ability to get early feedback from users can be a critical factor for success. While this is good practice in theory, in reality it can be cumbersome to achieve. You have a web application running locally on your computer, but sharing it with stakeholders for feedback most often requires a lot of infrastructure work first. That is time and effort you don’t want to spend just to show something that is work in progress.

This is the problem that the open web service Web Runner in Eyevinn Open Source Cloud addresses. In this blog post we will demonstrate how, with this service, you can take your web application and make it available online in minutes.

Why Open Source Cloud for web development prototyping?

Using an open web service based on open source, you are not locked in to a specific vendor and you have the option to run the very same code in your own infrastructure or cloud.

Create an account for free at app.osaas.io and create your tenant. If you already have access to Eyevinn Open Source Cloud you can skip this step.

Step 1: Create a Hello World web application

If you already have a web application you can skip this step.

We will create a web application using the NEXT.js framework in this example.

Create a new Next.js project by running this command

% npx create-next-app@latest

After the prompts, create-next-app will create a folder with your project name and install the required dependencies. We can try it out by running it.

% npm run build
% npm start

Now you have your web application available on http://localhost:3000

Step 2: Create a repository on GitHub

Log in or sign up on GitHub and create a repository for your web application, giving it a name, for example web-hello-world. It can be a private or a public repository. Push the code in the project folder, for example.

% git remote add origin git@github.com:birme/web-hello-world.git
% git branch -M main
% git push -u origin main

Step 3: Create a GitHub personal access token

Follow the steps described in the GitHub documentation for how to create a personal access token. You can skip this step if you already have a personal access token.

Step 4: Store token as a Service Secret

Now navigate to the Web Runner service in the Eyevinn Open Source Cloud web console. Click on the tab “Service Secrets” and click on the button “New Secret”. Give the secret a name and paste the GitHub token from your clipboard.

Step 5: Create Web Runner

To make the web application available online you create a Web Runner instance.

Click on the tab “My web-runners” and then on the button “Create web-runner”. Enter the URL to the GitHub repository with your web application code that you created in step 2, and enter a reference to the secret you created in step 4.

Press Create, and after a few minutes you should have an instance of your web application ready.

Give it a minute or two, then click on the instance card and it will open a new window or tab with your web application. If you see an error, the web application build process is still ongoing and you might just need to wait another minute or so.

You can now share the URL to this instance to stakeholders or users you want feedback from. In this example the URL is https://eyevinnlab-blog.eyevinn-web-runner.auto.prod.osaas.io

And as you see it is all running over HTTPS with a valid certificate.

To un-publish this web application you simply remove the instance that is running.

Extras

Using the GitHub action for OSC you can add this process to a build pipeline, for example to automatically create a Web Runner instance of your application when a branch is updated.

You can also create a Web Runner instance of your web application in integration and end-to-end tests. A sketch of how this could look using the OSC client SDK follows below.
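
The sketch below assumes the Context and createInstance helpers in @osaas/client-core behave as shown and that OSC_ACCESS_TOKEN is set in the environment; the option names for the Web Runner instance (GitHubUrl, GitHubToken) and the repository URL are placeholders, so check the SDK reference and the Web Runner service documentation for the exact parameters.

import { Context, createInstance } from '@osaas/client-core';

// Create a Web Runner instance for a branch, e.g. from a CI pipeline or a test setup
async function createPreviewInstance(branch: string) {
  const ctx = new Context();
  const serviceAccessToken = await ctx.getServiceAccessToken('eyevinn-web-runner');

  const instance = await createInstance(ctx, 'eyevinn-web-runner', serviceAccessToken, {
    name: `preview-${branch}`,
    GitHubUrl: 'https://github.com/birme/web-hello-world',
    GitHubToken: '{{secrets.ghtoken}}'
  });

  console.log('Created instance', instance);
}

createPreviewInstance('main').catch(console.error);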

Conclusion

We have now given an example of how quickly you can share a prototype or a work-in-progress web application using an open web service in Eyevinn Open Source Cloud. As this is an open web service in Eyevinn Open Source Cloud, you always have the option to run the same solution on your own premises as it is based on open source.

If you want to try this out you can sign up and launch one instance all for free.

Hosting a static website

This blog describes how to host a static website using open web services in Eyevinn Open Source Cloud. The static website is hosted on a bucket provided by the MinIO storage service based on open source.

Why Open Source Cloud for hosting a static website?

Using an open web service based on open source, you are not locked in to a specific vendor and you have the option to run the very same code in your own infrastructure or cloud.

Create an account for free at app.osaas.io and create your tenant. If you already have access to Eyevinn Open Source Cloud you can skip this step.

Step 1: Create a Hello World application

In this tutorial we will start by creating a Hello World application in React. We assume that you have NodeJS installed. Create a new React application by running the following command.

% npx create-react-app hello-world
Need to install the following packages:
create-react-app@5.0.1
Ok to proceed? (y) y

This will create a new directory called “hello-world” containing all the necessary files and dependencies for your React application.

Move into your new app’s directory:

% cd hello-world

Start the development server by running the following command.

% npm start

Your browser will open with this sample application.

Step 2: Build the application

Build the application to generate the static website. As it will be deployed to a subfolder we need to provide that information when building the website. We do that by setting the environment variable PUBLIC_URL when we build the application.

% PUBLIC_URL=/hello/ npm run build

The generated website is available in the folder called “build” in your directory.

Step 3: Deploy the build

Go to the web console of Eyevinn Open Source Cloud and obtain the access token available under Settings. Copy the token and store it in the environment variable OSC_ACCESS_TOKEN.

% export OSC_ACCESS_TOKEN=YOUR_TOKEN

Now run the following command to deploy the build. We name the website “hello” which will be the subfolder where the files are placed.

% npx @osaas/cli@latest web publish -s hello build/
Website published at: https://eyevinnlab-hello.minio-minio.auto.prod.osaas.io/hello/index.html
CDN settings:
 - Origin: eyevinnlab-hello.minio-minio.auto.prod.osaas.io
 - Origin Headers: 'Host: eyevinnlab-hello.minio-minio.auto.prod.osaas.io'
 - Origin Path: hello
 - Default root object: index.html

In this example the website is published and available at https://eyevinnlab-hello.minio-minio.auto.prod.osaas.io/hello/index.html

Step 4: Configure CDN

For performance and security you want to use a CDN provider for the delivery of the website. Now we also have the option to skip the subfolder and place the website in the root of the domain. To prepare for that we will rebuild the application without the PUBLIC_URL environment variable set.

% npm run build
% npx @osaas/cli@latest web publish -s hello build/

The files will be uploaded to the folder “hello”, so when we configure the CDN we need to set that path in the request to the origin. When you set up your distribution property at your CDN provider you will then use the following in this example:

  • Origin: eyevinnlab-hello.minio-minio.auto.prod.osaas.io
  • Protocol: HTTPS
  • Port: 443
  • Origin Path: /hello/
  • Origin Host Headers: eyevinnlab-hello.minio-minio.auto.prod.osaas.io
  • Default root object: index.html

Now the website will be available at https://<your CDN domain>/index.html

Conclusion

With the open web service providing static website hosting in Eyevinn Open Source Cloud you always have the option to run the same solution on your own premises as it is based on open source. You can create one website including 50 GB storage for free to try this out.

MinIO Storage as VOD Origin

As a continuation of the previous blog, where we described how to get started with MinIO storage in Eyevinn Open Source Cloud, we will in this blog walk you through how you can use it as an origin for Video On-Demand distribution.

Why Open Source Cloud as VOD Origin?

Using an open web service based on open source, you are not locked in to a specific vendor and you have the option to run the very same code in your own infrastructure or cloud.

We will not cover how to create video on demand files in this blog post as it is covered in detail in the Eyevinn Open Source Cloud documentation.

Create an account for free at app.osaas.io and create your tenant. If you already have access to Eyevinn Open Source Cloud you can skip this step.

Step 1: Create a MinIO bucket

Start by creating a MinIO bucket in Eyevinn Open Source Cloud by following the instructions in the documentation. By following this guide you should now have a bucket called “tutorial”.

Step 2: Enable public access to bucket

For a video player to be able to download the Video On-Demand files we need to enable public read-only access for the bucket. If you followed the guide you will have an alias to your MinIO server instance called “guide”, and using the MinIO command line tool you enable public access with the following command.

% mc anonymous set download guide/tutorial

Step 3: Upload VOD packages to bucket

Now let us upload VOD packages to this bucket. There are several options available here:

  • Set up a VOD creation pipeline in Eyevinn Open Source Cloud to create a VOD package from a video file.
  • Upload existing VOD packages on your computer to this bucket.
  • Migrate VOD packages from another origin using the HLS Copy to S3 service in Eyevinn Open Source Cloud.

In this walk-through we will use the “HLS Copy to S3” service to copy an HLS package we have available online to the bucket you created.

Navigate to the HLS Copy to S3 service and click on the button “Create Job”. Enter the following in the job creation dialog.

  • Name: guide
  • CmdLineArgs: https://maitv-vod.lab.eyevinn.technology/VINN.mp4/master.m3u8 s3://tutorial/
  • DestAccessKey: root
  • DestSecretKey: abC12345678
  • DestEndpoint: (MinIO server endpoint)

Press “Create” and wait for the job to complete.

Let us now verify that all files ended up in our bucket. We can use the MinIO command line tool or the AWS S3 client.

% mc ls guide/tutorial/VINN.mp4/
[2025-01-15 13:51:33 CET]   351B STANDARD master.m3u8
[2025-01-15 13:51:58 CET]     0B 1000/
[2025-01-15 13:51:58 CET]     0B 2000/
[2025-01-15 13:51:58 CET]     0B 600/

Step 4: Verify VOD package

We can now verify that the VOD package can be played. Open a web browser, go to our online web player at https://web.player.eyevinn.technology/ and enter the URL to the index file; in our example it is https://demo-guide.minio-minio.auto.prod.osaas.io/tutorial/VINN.mp4/master.m3u8

Step 5: Configure CDN

To be able to handle the distribution of these VOD files you need to set up a CDN that your users go through to access the files. Pointing your users directly to the origin is not recommended, as it is not designed to handle requests at large scale. For performance and security you should use a CDN provider for the delivery.
When you set up your distribution property at your CDN provider you will use the following:

  • Origin: Your MinIO instance hostname, e.g. demo-guide.minio-minio.auto.prod.osaas.io
  • Protocol: HTTPS
  • Port: 443
  • Origin Host Header: e.g. demo-guide.minio-minio.auto.prod.osaas.io

What is important here is that the Host header in the HTTPS request to the origin is the hostname of the MinIO storage instance and not the hostname in the viewer request. Consult your CDN provider’s documentation on how to configure this.

Conclusion

With the open web service providing origin functionality in Eyevinn Open Source Cloud you always have the option to run the same solution on your own premises as it is based on open source. You can create one MinIO instance including 50 GB storage for free to try this out.