Getting started with Ooblex.

Ooblex comes prebuilt with a working application, open sourced for you to run, configure, and contribute to.
View on GitHub

As a container system, Ooblex comprises several services, including those for media and inference. While these services can be run independently, they can also be run together in a single server environment.

Ooblex was developed for use on a server with a solid internet connection, an Intel 28-core CPU, and 16 GB of RAM.

A domain name is required for a full installation as SSL is often required for WebRTC browser support. Point a valid domain (or subdomain) at your public server’s IP address.

It is also recommended that all ports be made available and open to the server, as port tunneling and WebRTC TURN servers are not currently supported.

This example assumes a fresh, working Ubuntu server deployment, logged in as the root user. Start by entering the following commands, line by line, into the Ubuntu terminal.


cd ~
sudo apt-get update
sudo apt-get install git
git clone https://github.com/ooblex/ooblex/

cd ooblex
sudo chmod +x *.sh

sudo ./install_opencv.sh

OpenCV should now be building, which can take anywhere from 30 minutes to a few hours to complete.


sudo ./install_nginx.sh

During the NGINX install, the installer will prompt for a domain name in order to generate a free SSL certificate. As mentioned in the requirements above, you will need a domain name pointing at your server for this to work. If asked, redirecting HTTP to HTTPS is preferred.


sudo ./install_tensorflow.sh
sudo ./install_gstreamer.sh
sudo ./install_janus.sh
sudo ./install_webrtc.sh
sudo ./install_redis.sh

Most of Ooblex's dependencies are now installed. You will still need to deploy Redis and RabbitMQ servers of your own, though. If you are deploying to the cloud, your cloud provider may offer these as hosted services.

Edit the config.py file so that it uses your own Redis and RabbitMQ URIs. You will also need to enter the domain name used when configuring the SSL certificate earlier.

You can edit the config.py with the following command:


sudo nano config.py

Next, let's configure and copy the files in the HTML folder to your /var/www/html folder. You will need to modify both the index.html file and the Ooblex.v0.js file contained within the JS folder, updating the domain name from api.ooblex.com to whatever your domain is.
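A small script can do the domain substitution for you. This is a sketch, not part of Ooblex itself; YOUR_DOMAIN is a placeholder, and the file paths assume the web files have already been copied to /var/www/html as described above:

```python
from pathlib import Path

YOUR_DOMAIN = "api.example.com"  # assumption: replace with your actual domain

def swap_domain(text: str, new_domain: str = YOUR_DOMAIN) -> str:
    """Replace every occurrence of the demo domain with your own."""
    return text.replace("api.ooblex.com", new_domain)

def update_file(path: Path, new_domain: str = YOUR_DOMAIN) -> None:
    """Rewrite one deployed web file in place with the new domain."""
    path.write_text(swap_domain(path.read_text(), new_domain))

# Apply to the deployed web files, for example:
#   update_file(Path("/var/www/html/index.html"))
#   update_file(Path("/var/www/html/js/Ooblex.v0.js"))
```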

You will also need to configure the following files:

decoder.py : modify the Redis and AMQP connection strings so they point to your own servers, rather than Ooblex's demo servers.
mjpeg.py : requires the SSL certificate location to be properly configured.
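Before pasting connection strings into these files, it can help to sanity-check that they are well-formed URIs. The helper below is a standalone sketch using only the standard library; the URIs shown are placeholders:

```python
# Quick sanity check for the Redis and AMQP connection strings you plan
# to put into decoder.py. The example URIs are placeholders.
from urllib.parse import urlparse

def check_uri(uri: str, expected_scheme: str) -> dict:
    """Parse a connection URI and return host/port, raising on a scheme mismatch."""
    parts = urlparse(uri)
    if parts.scheme != expected_scheme:
        raise ValueError(f"expected {expected_scheme}://, got {parts.scheme}://")
    return {"host": parts.hostname, "port": parts.port}
```

For example, `check_uri("amqp://user:pw@mq.example.com:5672/", "amqp")` confirms the scheme and extracts the host and port.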

Next, we need to configure Janus. Copy the following files into your Janus config folder, and configure them accordingly.


cd ~/ooblex/janus_confs
sudo cp * /opt/janus/etc/janus

You will also need to modify the AMQP / RabbitMQ configuration settings to point to the correct address. The settings in config.py are not in a format Janus' config files can use, so you will have to update them manually.
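As a rough illustration, the RabbitMQ-related section of a Janus config might look like the fragment below. The exact option names and the file they live in vary between Janus versions, so treat this as a sketch and confirm against the commented defaults in your own copy of the config files:

```ini
; Illustrative fragment only -- option names differ between Janus
; versions; check the comments in your installed config files.
[general]
enabled = yes
host = rabbitmq.example.com   ; your RabbitMQ server address
port = 5672
username = user
password = password
```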

You will also need to point the config files to the correct SSL certificate.

The following files and lines need updating:


janus.cfg:67:cert_pem = /etc/letsencrypt/live/api.ooblex.com/fullchain.pem
janus.cfg:68:cert_key = /etc/letsencrypt/live/api.ooblex.com/privkey.pem
janus.transport.http.cfg:64:cert_pem = /etc/letsencrypt/live/api.ooblex.com/fullchain.pem
janus.transport.http.cfg:65:cert_key = /etc/letsencrypt/live/api.ooblex.com/privkey.pem
janus.transport.websockets.cfg:37:cert_pem = /etc/letsencrypt/live/api.ooblex.com/fullchain.pem
janus.transport.websockets.cfg:38:cert_key = /etc/letsencrypt/live/api.ooblex.com/privkey.pem
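The certificate paths above follow the standard Let's Encrypt layout, with one directory per domain under /etc/letsencrypt/live. A small helper can build the two paths for your own domain before you paste them into each config file; this is an illustrative convenience, not part of Ooblex:

```python
from pathlib import Path

def letsencrypt_paths(domain: str) -> dict:
    """Build the cert_pem / cert_key paths for a domain, following the
    standard Let's Encrypt directory layout used in the config lines above."""
    base = Path("/etc/letsencrypt/live") / domain
    return {"cert_pem": base / "fullchain.pem", "cert_key": base / "privkey.pem"}
```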

The tensorthread_shell.py file is a template for your own AI scripts; it is what brain.py uses for its own demo code. It is quite approachable if you are already familiar with Python and TensorFlow. If you work with IBM's Watson Studio, a Python-based export of a trained model can be imported directly into this tensorthread_shell.py file for rapid deployment of a high-performing, low-latency, serialized model, with virtually no coding needed.

brain.py (or tensorthread_shell.py, for that matter) can be distributed and run across many servers. Just repeat the setup steps above on each new server, but run only brain.py (or tensorthread_shell.py), without any of the other services. The Tensor Threads will work on AI tasks from the main Redis/RabbitMQ queue and accelerate the overall system's processing performance. They can also run on Windows, or be deployed to virtually any system that supports Python and TensorFlow (or TensorRT).
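Structurally, such a worker is just a loop that blocks on the shared queue, decodes a task, and runs a model on it. The sketch below illustrates that shape only: the queue name, task fields, and Redis host are assumptions for illustration, and the real task schema is defined by brain.py:

```python
# Structural sketch of a distributed Tensor Thread worker. The queue
# name and task fields ("task_id", "model") are illustrative
# assumptions; the real schema lives in brain.py.
import json

QUEUE = "ooblex-tasks"  # assumed queue name

def parse_task(raw: bytes) -> dict:
    """Decode one JSON task message popped from the queue."""
    task = json.loads(raw)
    if "task_id" not in task:
        raise ValueError("task is missing a task_id")
    return task

def run_worker() -> None:
    """Block on the shared Redis queue and process tasks forever.
    Requires the 'redis' package and a reachable Redis server."""
    import redis
    r = redis.Redis(host="redis.example.com", port=6379)  # your server
    while True:
        _, raw = r.blpop(QUEUE)          # block until a task arrives
        task = parse_task(raw)
        print("processing", task["task_id"])  # run your model here

# On each additional server you would simply call run_worker().
```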

Linked below are pre-trained models that are too large to include on GitHub, but that are required for the included demo functionality of Ooblex. Included are a facial detection model and two face-swapping transformative models; they are ready to be downloaded and deployed to Ooblex's models folder.

Download