Post comments and corrections in a public category, or send a direct message to @jcasman or @craig on the lists system. You must be logged in to send a message.

This is an unofficial, community-generated guide for developing applications that use 360 media from RICOH THETA cameras. This is not authorized by RICOH and is based on publicly available information.

Get the latest news and updates on Twitter @theta360dev

1. Resources

1.2. Unofficial Information

2. 360 Video Formats

Video is stored on the camera in dual-fisheye mode.

The dual-fisheye MP4 file has an image size of 1920x1080 or 1280x720.

The official RICOH desktop app will convert this to equirectangular mode. By default, the program takes your file name and appends _er to it. The file is saved in the same directory as the original video file.

Both the input and output are .mp4 format videos.
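As a sketch of the naming convention described above (converted_path is a hypothetical helper, not part of the RICOH app):

```python
from pathlib import Path

# The desktop app's convention: the converted file keeps the
# original name with "_er" appended and lands in the same directory.
# converted_path is a hypothetical helper for illustration only.
def converted_path(original):
    p = Path(original)
    return p.with_name(p.stem + "_er" + p.suffix)

print(converted_path("R0010001.MP4").name)  # R0010001_er.MP4
```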

dual fish eye
Figure 1. dual fish eye video format

The program converts the video to equirectangular format. Note that there is a minor color shading difference on the right side of the screen grab. The camera has two lenses, and each lens captures and saves one sphere. The desktop software stitches the image together, and the RICOH app adds the video metadata for you.

If you use something like Adobe Premiere Pro to edit your videos, you will need to inject the metadata again.
equirectangular video
Figure 2. equirectangular video format

You can now upload the video to YouTube on your normal channel. YouTube will read the metadata and add the 360 degree video controls automatically. This is cool!

youtube 360
Figure 3. YouTube 360 video
I’ve successfully tested the YouTube viewer on Windows and Mac with different browsers, and on Linux with Firefox. It does not appear to work on Linux with Chrome.

3. Video Live Streaming

The THETA S can stream video when connected to a USB or HDMI cable. The output is in dual fish-eye mode (two spheres). To make the video usable in a web browser or with external services, you’ll need to convert the video into equirectangular mode.
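As background for that conversion, here is a minimal numeric sketch of where an equirectangular output pixel samples from in the dual-fisheye frame. It assumes an ideal equidistant lens with a roughly 190 degree field of view and the front lens on the +z axis; the real camera requires calibrated lens parameters, which RICOH does not publish (see the proprietary lens data section), so this is illustrative only.

```python
import math

# Illustrative only: ideal equidistant fisheye, front lens on +z,
# ~190 degree field of view. The real THETA S needs calibrated
# lens parameters, which RICOH does not publish.
def equirect_to_fisheye(u, v, fov_deg=190.0):
    """Map equirectangular (u, v) in [0,1]^2 to one fisheye circle.

    Returns (lens, fx, fy): lens 0 is front, 1 is back;
    fx, fy are coordinates in the unit fisheye circle."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    # Direction on the unit sphere for this output pixel
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    lens = 0 if z >= 0 else 1
    if lens == 1:
        x, z = -x, -z            # look through the back lens instead
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle off the lens axis
    r = theta / math.radians(fov_deg / 2.0)    # equidistant: radius ∝ angle
    d = math.hypot(x, y) or 1.0
    return lens, r * x / d, r * y / d
```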

To view a live streaming video with 360 degree navigation, you can connect it to a YouTube live streaming event.

YouTube live streaming must now be set up as an Event, not a Stream. You must check This live stream is 360 under the Advanced Settings tab.
Table 1. THETA S Live Streaming

| Type | Format | Camera Mode | Size | Frame Rate | Connection |
| --- | --- | --- | --- | --- | --- |
| Live View | Equirectangular in MotionJPEG | Image Only | | | |
| USB live streaming of dual-fisheye | Dual-fisheye in MotionJPEG | Live Streaming | | | USB isochronous transfer |
| HDMI live streaming of dual-fisheye | Dual-fisheye in uncompressed YCbCr | Live Streaming | 1920x1080, 1280x720, 720x480 | | |
| USB live streaming of equirectangular | Equirectangular in MotionJPEG | Live Streaming | | | |
There are pros and cons of each method.

Table 2. Pros and Cons of Each Live Streaming Method

| Live Stream Type | Pro | Con | Community Implementation |
| --- | --- | --- | --- |
| USB equirectangular | Easy; appears as a webcam to your application | Low resolution (720p); slow frame rate (15fps) | Looking forward to seeing this after release! Hope they output to HDMI |
| HDMI conversion to equirectangular | Highest resolution and frame rate; most common input for broadcast equipment | Difficult; no packaged solution; people are building their own | Most people use Unity and then connect third-party equipment for live event broadcasting |
| USB dual-fisheye | Easy; appears as a webcam; higher resolution than equirectangular; less load on CPU | Output is dual-fisheye, so you’ll need to build an application to process it | Experimenting with facial and object recognition, surveillance |
| WiFi Live View | Works over WiFi; equirectangular | Low resolution; slow; designed for preview, not live streaming | Preview scenes |

Unless you are using camera._getLivePreview in image mode to display a low-resolution live view with a low frame rate, the first step is to get the camera into live streaming mode.
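camera._getLivePreview returns the preview as motion JPEG over HTTP. As a sketch, individual frames can be cut out of the byte stream by scanning for the JPEG start (0xFFD8) and end (0xFFD9) markers; split_mjpeg is a hypothetical helper, and a robust client should parse the HTTP multipart boundaries instead.

```python
# Sketch: cut individual JPEG frames out of a motion JPEG byte
# stream such as the camera._getLivePreview response. split_mjpeg
# is a hypothetical helper; a robust client should parse the
# HTTP multipart boundaries instead of scanning for markers.
def split_mjpeg(buf):
    frames = []
    start = buf.find(b"\xff\xd8")               # JPEG start-of-image
    while start != -1:
        end = buf.find(b"\xff\xd9", start + 2)  # JPEG end-of-image
        if end == -1:
            break                               # incomplete trailing frame
        frames.append(buf[start:end + 2])
        start = buf.find(b"\xff\xd8", end + 2)
    return frames
```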

thetas livestreaming
Figure 4. Set Live streaming video mode on RICOH THETA S

3.1. For USB

  1. Press and hold the mode button, then press the power button → the camera starts in live streaming mode.

  2. Connect a USB cable between the THETA S and a laptop (Mac or PC).

  3. The THETA S can now be used as a webcam. You can use webcam software such as Skype to see live video streaming from the THETA S.

usb live streaming
Figure 5. Live video streaming from RICOH THETA S to a computer monitor with USB 3

3.2. For HDMI

  1. Press and hold the mode button, then press the power button → the camera starts in live streaming mode.

  2. Connect an HDMI cable between the THETA S and a monitor.

  3. The THETA S acts as a video output device. The monitor shows the live video stream from the camera.

hdmi live streaming
Figure 6. Live video streaming from RICOH THETA S to a TV with HDMI

3.3. API Testing

{
    "name": "camera.getOptions",
    "parameters": {
        "sessionId": "SID_0003",
        "optionNames": [
            "iso",
            "captureMode"
        ]
    }
}
Remember to set your sessionId correctly.

The response:

{
    "name": "camera.getOptions",
    "state": "done",
    "results": {
        "options": {
            "iso": 0,
            "captureMode": "_liveStreaming"
        }
    }
}
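A minimal sketch of issuing the command above from Python, assuming the standard THETA API v2 endpoint with the camera connected over its own WiFi network (camera address 192.168.1.1):

```python
import json
import urllib.request

# Sketch: build a camera.getOptions request for the THETA's
# /osc/commands/execute endpoint. Address and payload shape assume
# the standard THETA API v2 over the camera's WiFi.
def get_options_request(session_id, option_names):
    body = {
        "name": "camera.getOptions",
        "parameters": {
            "sessionId": session_id,
            "optionNames": option_names,
        },
    }
    return urllib.request.Request(
        "http://192.168.1.1/osc/commands/execute",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )

req = get_options_request("SID_0003", ["iso", "captureMode"])
# with urllib.request.urlopen(req) as resp:   # uncomment with camera attached
#     print(json.load(resp))
```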

3.4. RICOH Live Streaming Driver (THETA UVC Blender) with Equirectangular Output

Figure 7. USB live streaming with equirectangular

The USB driver appears as a webcam to applications running on a Mac or PC. In the example above, the equirectangular video is shown streaming in QuickTime. A video clip that shows the output of the THETA S USB live streaming is available on YouTube.

You must install the live streaming driver, THETA UVC Blender. Download it from the official RICOH site.

driver download
Figure 8. Download the official RICOH live streaming driver

This walkthrough is for Windows 10 64 bit.

Right click on the UVCBlender_setup64_en.exe icon and select Run as administrator.

uvc blender setup icon
Figure 9. Right click to run setup as admin
setup run as admin
Figure 10. Make sure you run it as administrator

Run through the install wizard.

install shield
Figure 11. Follow onscreen instructions
install shield complete
Figure 12. InstallShield Wizard Completes

After installation, you will need to connect a THETA that is powered off to register the device. You may need to reboot. In my tests, I could not advance to the UVC register step without a reboot. If I try to register the device without rebooting, I see a flashing Establishing a Connection dialog, but the connection is never completed.

Figure 13. THETA UVC Register

If you need to reboot, run the THETA UVC Register application as administrator.

theta uvc register
Figure 14. Run THETA UVC Register as administrator

After rebooting and starting THETA UVC Register as administrator, I plug in the THETA with its power off. The connection is established.

register button
Figure 15. If you don’t see the register button, go back a few steps

The registration completes.

registration complete
Figure 16. Make sure the registration is successful

At this stage, you may need to reboot again. If you do, it is a one-time requirement.

Now, press power and mode to start the THETA S in live streaming mode. Test it with a common video streaming application such as Google Hangout. I have heard of people having problems displaying the stream. Start your application first with the camera unplugged. Once the application is running, select THETA UVC Blender, then start your THETA in live streaming mode, and plug it in. There are probably other ways to get the application to recognize the live stream, but this sequence consistently works for me.

Figure 17. Test with Google Hangout

Go into the settings of Google Hangout to select the THETA UVC Blender webcam.

select uvc blender
Figure 18. Select THETA UVC Blender, not THETA S
uvc blender selected
Figure 19. video should be in equirectangular
google hangout in use
Figure 20. THETA S in use as webcam

Here’s an example with Open Broadcaster Software.

Create a new Scene called THETA Test. Right click in Sources to add a new Video Capture Device.

obs video capture
Figure 21. Add THETA S as Video Capture Device to OBS

Under the Device Selection window, select THETA UVC Blender. Click OK.

obs video device selection
Figure 22. Select THETA UVC Blender as Device

The video stream will fill a portion of the screen. Select Edit Scene to size the video stream to fit.

obs base
Figure 23. Edit Scene to Fit the Stream
Figure 24. Live Streaming with Open Broadcaster Software

You can use OBS or other software to stream the image to YouTube. Refer to the blog post below for more information:

3.5. Example with the Processing Language

Community Contribution from Sunao Hashimoto, kougaku on GitHub. Full sample source code is available.

theta s live viewer
Figure 25. Live viewer for THETA S

The example above is built with Processing.

Additional information is on his blog post in Japanese.

kougaku stitching english
Figure 26. dual fish-eye video stitching

3.6. Examples with 3D Tools such as Unity and Maya LT

Nora, @Stereoarts, released a shader pack to convert THETA 360 degree media into equirectangular format in real time.

The developer below, GOROman, was able to get reasonable 360 video live streaming in equirectangular mode after an hour of work back in September 2015. Additional information in Japanese is here.

goro man
Figure 27. equirectangular video without stitching

The section below was translated from Japanese by Jesse Casman.

hecomi recently wrote an interesting blog post using Unity to view realtime 360 degree video streaming. I personally have very little experience with Unity, but the content and pictures were useful, so I translated the blog post for others to use. This is not an exact translation, but it should be much clearer than a Google Translate output.

I noticed GOROman (@GOROman) tweeting the info below, so I decided to try it myself right then and there.

@GOROman tweet: You should be able to easily build a live VR stream from this. Stitching might be an issue… For the time being, it might be fine to just connect the sphere to the UV texture.

The THETA S…​includes features like dual-fisheye streaming over USB (1280x720, 15fps) and HDMI streaming (1920x1080, 30fps). In order to view this using Unity, I made an appropriately UV-mapped sphere and a shader to alpha-blend the border. Ultimately, for the purpose of making a full sphere with the THETA S, it would be much higher quality and more convenient (you can use Skybox too!) to use the fragment shader made by Nora (@Stereoarts), which writes the equirectangular projection directly onto a plane.

@Stereoarts tweet: I’ve released a Theta Shader Pack. A shader for converting THETA / THETA S full sphere video to Equirectangular in Unity and supporting scripts.

For this article, I wanted to jot down my techniques as well.


Example of taking a video with THETA

The THETA S gives beautifully separated spheres. The angle covered in one sphere is slightly larger than 180 degrees.

dual fish eye
Figure 28. dual fish eye image
dual fisheye tripod
Figure 29. dual fish eye image with tripod

For this, I made adjustments using a sample texture that GOROman captured with WebCamTexture.

Making a sphere with good UV setting

Working in Maya LT, if you make a normal sphere, the UV layout comes out like this.

MayaLT UV mapping
Figure 30. Maya LT UV Mapping

If you lay that UV out on a plane, it looks like the image below.

MayaLT UV mapping 2
Figure 31. Maya LT UV Mapping 2

All it needs is to be cut in half and moved appropriately.

MayaLT UV mapping 3
Figure 32. Maya LT UV Mapping 3
MayaLT UV mapping 4
Figure 33. Maya LT UV Mapping 4

It looks like this. (I did not adjust it, so it might be slightly off.)

crescent moon
Figure 34. Crescent Moon image

Actually, I wanted to alpha-blend the border, so I used two overlapping half spheres instead of one sphere. The UV border is stretched manually as needed.

Figure 35. Overlapping

Incidentally, the surface is set to face inward by reversing all the normal vectors. The UV position and size are fine to adjust later in the shader.

Setting with Unity

Import the model built in Maya LT into Unity and put the camera in the center. Write a shader so the model's UV position can be adjusted and the border alpha-blended. To control the draw order and prevent the border from shifting at certain orientations, each half sphere has a different shader.

Shader "Theta/Sphere1" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {}
        _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0
        _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0
        _ScaleU ("Scale U", Range(0.8, 1.2)) = 1
        _ScaleV ("Scale V", Range(0.8, 1.2)) = 1
        _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0
        _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0
    }
    SubShader {
        Tags { "RenderType" = "Transparent" "Queue" = "Background" }
        Pass {
            Name "BASE"
            Blend SrcAlpha OneMinusSrcAlpha
            Lighting Off
            ZWrite Off

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            uniform sampler2D _MainTex;
            uniform sampler2D _AlphaBlendTex;
            uniform float _OffsetU;
            uniform float _OffsetV;
            uniform float _ScaleU;
            uniform float _ScaleV;
            uniform float _ScaleCenterU;
            uniform float _ScaleCenterV;

            float4 frag(v2f_img i) : COLOR {
                // Fine-tune the center position and size
                float2 uvCenter = float2(_ScaleCenterU, _ScaleCenterV);
                float2 uvOffset = float2(_OffsetU, _OffsetV);
                float2 uvScale = float2(_ScaleU, _ScaleV);
                float2 uv = (i.uv - uvCenter) * uvScale + uvCenter + uvOffset;
                // Sample the alpha-blend texture to adjust the alpha
                float4 tex = tex2D(_MainTex, uv);
                tex.a *= pow(1.0 - tex2D(_AlphaBlendTex, i.uv).a, 2);
                return tex;
            }
            ENDCG
        }
    }
}

Here’s a second section of code.

Shader "Theta/Sphere2" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {}
        _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0
        _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0
        _ScaleU ("Scale U", Range(0.8, 1.2)) = 1
        _ScaleV ("Scale V", Range(0.8, 1.2)) = 1
        _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0
        _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0
    }
    SubShader {
        Tags { "RenderType" = "Transparent" "Queue" = "Background+1" }
        UsePass "Theta/Sphere1/BASE"
    }
}

For the alpha blend, make a texture whose alpha is adjusted to the UV layout, as below. I got a perfect fit by exporting the UV as PostScript and reading it into Illustrator (the white circle inside is alpha = 1; around the circle, alpha falls from 1 to 0 from inside to outside; the outside is not used, so anything there is fine).

two circles
Figure 36. Two Circles

Then, adjust the parameters and you’ve got a whole sphere.

Figure 37. Parameters
sphere unity
Figure 38. Unity Sphere
realtime stitching
Figure 39. Realtime Stitching

Changing into Equirectangular

I tried it with a modified vertex shader.

Shader "Theta/Equirectangular1" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {}
        _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0
        _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0
        _ScaleU ("Scale U", Range(0.8, 1.2)) = 1
        _ScaleV ("Scale V", Range(0.8, 1.2)) = 1
        _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0
        _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0
        _Aspect ("Aspect", Float) = 1.777777777
    }
    SubShader {
        Tags { "RenderType" = "Transparent" "Queue" = "Background" }
        Pass {
            Name "BASE"
            Blend SrcAlpha OneMinusSrcAlpha
            Lighting Off
            ZWrite Off

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #define PI 3.14159265358979
            #include "UnityCG.cginc"

            uniform sampler2D _MainTex;
            uniform sampler2D _AlphaBlendTex;
            uniform float _OffsetU;
            uniform float _OffsetV;
            uniform float _ScaleU;
            uniform float _ScaleV;
            uniform float _ScaleCenterU;
            uniform float _ScaleCenterV;
            uniform float _Aspect;

            struct v2f {
                float4 position : SV_POSITION;
                float2 uv       : TEXCOORD0;
            };

            v2f vert(appdata_base v) {
                float4 modelBase = mul(_Object2World, float4(0, 0, 0, 1));
                float4 modelVert = mul(_Object2World, v.vertex);
                float x = modelVert.x;
                float y = modelVert.y;
                float z = modelVert.z;
                float r = sqrt(x*x + y*y + z*z);
                x /= 2 * r;
                y /= 2 * r;
                z /= 2 * r;
                float latitude  = atan2(0.5, -y);
                float longitude = atan2(x, z);
                float ex = longitude / (2 * PI);
                float ey = (latitude - PI / 2) / PI * 2;
                float ez = 0;
                ex *= _Aspect;
                modelVert = float4(float3(ex, ey, ez) * 2 * r, 1);
                v2f o;
                o.position = mul(UNITY_MATRIX_VP, modelVert);
                o.uv       = MultiplyUV(UNITY_MATRIX_TEXTURE0, v.texcoord);
                return o;
            }

            float4 frag(v2f i) : COLOR {
                float2 uvCenter = float2(_ScaleCenterU, _ScaleCenterV);
                float2 uvOffset = float2(_OffsetU, _OffsetV);
                float2 uvScale = float2(_ScaleU, _ScaleV);
                float2 uv = (i.uv - uvCenter) * uvScale + uvCenter + uvOffset;
                float4 tex = tex2D(_MainTex, uv);
                tex.a *= pow(1.0 - tex2D(_AlphaBlendTex, i.uv).a, 2);
                return tex;
            }
            ENDCG
        }
    }
}

Here’s a second section of code.

Shader "Theta/Equirectangular2" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {}
        _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0
        _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0
        _ScaleU ("Scale U", Range(0.8, 1.2)) = 1
        _ScaleV ("Scale V", Range(0.8, 1.2)) = 1
        _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0
        _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0
        _Aspect ("Aspect", Float) = 1.777777777
    }
    SubShader {
        Tags { "RenderType" = "Transparent" "Queue" = "Background+1" }
        UsePass "Theta/Equirectangular1/BASE"
    }
}
Figure 40. Results

When taking a look at the mesh, it moves around like this.

results mesh
Figure 41. Results Mesh

Because the polygons did not fit exactly, there is a blank space in the corner. This could have been avoided by using a direct fragment shader like Nora's.


It looks like there’s the possibility of multiple fun topics here like spherical AR and Stabilization. After the THETA S goes on sale, I would love to play with it more.

Update March 9, 2016

Maya Planar UV Mapping Instruction

Thanks to @hecomi for providing this video that shows how to tweak the UV mapping. He produced it to help Flex van Geuns, who was trying to use the UV mapping with WebGL.

3.7. Community Articles About 360 Display in Unity

4. Image Viewers

The RICOH THETA S will generate an equirectangular JPEG file of 5376x2688 or 2048x1024.

4.1. Example in the Processing Language

still image stitching english
Figure 42. still image stitching

4.2. Example in Javascript

akokubo javascript viewer
Figure 43. 360 degree still image in Chrome web browser

At the hackathon, we used the open source viewer from Akokubo listed above as part of our real estate demo built by a high school student.

There are other open source JavaScript viewers that we have not tested:

5. Graphic Tech

5.1. HDR

Simple HDR is a great application that works with the RICOH THETA S and m15 models. It is developed by Built Light.

Nick Campbell produced a video that gives a good overview of Simple HDR and 360 HDRIs

Another HDR application is HDR 360 Bracket Pro for RICOH THETA by Brad Herman, the CTO and co-founder of SPACES.

6. Linux

RICOH only supports Mac and Windows desktop. As many developers use Linux, we’ve collected some information from the community to help people with basic tasks.

Linux can be used to control the camera HTTP API. There are also a number of scripts to get media from the camera.

Tips from the community:

  • YouTube 360 videos work with Firefox on Linux. Some people have had problems with Chrome

  • If you’re running Linux in VirtualBox as a guest, turn off 3D hardware acceleration

  • There are a large number of viewers at the site. I’ve been using FSPViewer.

If you want to use Linux to download media from the THETA and view it on your Linux box, you can use Wine to run the THETA Windows app for image viewing, or use a third-party application such as Pano2VR.

Documentation below contributed by Svendus

SphericalViewer.exe installs and opens with Wine. It runs, and you can view spherical images, but videos are not converted.

Linux users can also import the files and use Pano2VR5.


    sudo apt-get install --no-install-recommends gnome-panel
    sudo gnome-desktop-item-edit /usr/share/applications/ --create-new
new app
pano2vr 1
pano2vr 2
pano2vr 3
pano2vr 4

7. Proprietary Technical Information

7.1. Lens Parameter Information

The lens geometry for the THETA is based on equidistant projection. The final projection style for both videos and images is equirectangular projection. RICOH does not make detailed lens parameter information available. This is also known as lens distortion data. Developers often ask for this information to improve stitching. It is proprietary and not available as of December 2015. Stitching is still possible without this information.

7.2. Stitching Libraries

The RICOH THETA S processes still images inside of the camera. It takes 8 seconds for the camera to be ready to take another still image.

The videos are stored in dual-fisheye format (two spheres). The official RICOH applications will convert this into equirectangular format on either mobile devices or desktop computers. This format can then be viewed locally or uploaded to YouTube, Facebook, or other sites.

The source code and algorithms to perform this stitching are not available to developers.

As of December 2015, there is no way to use the RICOH libraries in live streaming mode.

8. FAQ for Media Editing

8.1. Q: How do I edit video in Adobe Premiere Pro?


  1. Download dual-fisheye media file to your computer

  2. Convert with official RICOH THETA desktop application. File name will end in _er

  3. Edit in Premiere Pro. Change audio track or add special effects with After Effects

  4. On your desktop computer inject metadata again using another tool.

  5. Upload to YouTube or other 360 degree video player or site
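Step 4 (re-injecting the 360 metadata) is commonly done with Google's open source Spatial Media Metadata Injector. The command form below is an assumption based on that project's README, and inject_cmd is a hypothetical helper for illustration:

```python
# Sketch of re-injecting spherical metadata with Google's open
# source Spatial Media Metadata Injector. The command form is an
# assumption based on that project's README; inject_cmd is a
# hypothetical helper, not a RICOH tool.
def inject_cmd(src, dst):
    return ["python", "spatialmedia", "-i", src, dst]

print(" ".join(inject_cmd("edited.mp4", "edited_injected.mp4")))
```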

8.2. Q: How do I edit a still image in Photoshop or other image editing software?

Example of use: The photographer wants to lighten an image.

Here is the workflow:

  • Edit your JPEG in Photoshop

  • EXPORT as JPEG to a NEW FILE. Do not overwrite the original (heck, back it up or something).

  • Run ExifToolGUI

  • Select the exported JPEG file

  • Select Menu option Modify/Remove Metadata

  • Select the top option '-remove ALL metadata' and click 'Execute' button

  • Make sure your exported JPEG file is still selected

  • Select Menu option 'Export/Import'/Copy metadata into JPG or TIFF.

  • In the File Dialog select the ORIGINAL panorama JPG file.

  • Make sure ALL options are selected and click 'Execute'

  • If you look at the Metadata tag with the ALL button clicked, you should see both a section labeled 'Ricoh' AND one labeled 'XMP-GPano' (Google's XMP Pano)

  • Test the exported JPG in the RICOH program. Hopefully it worked.
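If you prefer the command line, the same strip-and-copy round trip can be sketched with exiftool. The helper functions below are hypothetical; -all= and -tagsFromFile are standard exiftool flags:

```python
import subprocess

# Sketch of the ExifToolGUI workflow above using exiftool's CLI.
# The helper functions are hypothetical; the flags are standard:
#   -all=          removes all metadata (exiftool keeps a _original backup)
#   -tagsFromFile  copies all tags from the original panorama
def strip_metadata_cmd(edited):
    return ["exiftool", "-all=", edited]

def copy_metadata_cmd(original, edited):
    return ["exiftool", "-tagsFromFile", original, "-all:all", edited]

if __name__ == "__main__":
    for cmd in (strip_metadata_cmd("edited.jpg"),
                copy_metadata_cmd("original.jpg", "edited.jpg")):
        print(" ".join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment with exiftool installed
```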

8.3. Q: How do I connect the camera as a USB storage device?

A: Hold the WiFi and Shutter buttons on your camera while you plug the camera into the USB port of your computer. The camera will appear as RICODCX. This is generally more of a problem on Macs. Make sure you turn off auto-import into Photos. People have experienced problems with importing the 360 images into Photos. Save them to disk and use the RICOH app.

8.4. Q: What are the technical specifications of images and video?

A: The official RICOH site has great information in the overview section.