# What's Portalgraph?

The Japanese version is available [here](https://portalgraphvr.gitbook.io/portalgraphdokyumento).

Portalgraph is a Unity-based development environment for experiencing VR spaces on projectors and PC screens. Because the image changes in response to the user's movements, the space truly seems to exist inside the screen — you can look into it in 3D from any position: up, down, left, right, near, or far.

<figure><img src="/files/2PTQKc5mW3hRO0q9H87G" alt=""><figcaption><p>Portalgraph application</p></figcaption></figure>

In addition to standard projector screens, Portalgraph can project VR spaces onto a tabletop, onto a box built from multiple displays, onto ceilings, and more.

<figure><img src="/files/7QrDH8wZf4cle6eEhY5c" alt=""><figcaption><p>Desktop Portalgraph</p></figcaption></figure>

<figure><img src="/files/3vZn6uGRqMlI1X10snor" alt=""><figcaption><p>Portalgraph with a Box Made from Two Combined LCD Displays</p></figcaption></figure>

Developers can easily build such demos using Unity.

<figure><img src="/files/cit2qZZI0g0R91XJZpzv" alt=""><figcaption></figcaption></figure>

The communication protocol between the tracking application and the Portalgraph core is publicly available, so you can implement your own custom tracking application if desired.

#### Get Started Here

For those who want to experience Portalgraph

{% content-ref url="/pages/yQPTkFAKLWG0b1zvO5Md" %}
[Turn Your PC Screen into a VR Space](/portalgraph-document-en/quick-start/turn-your-pc-screen-into-a-vr-space.md)
{% endcontent-ref %}

{% content-ref url="/pages/6im6Fj5tYylVplatyfDo" %}
[Turn a 3D Projector into a VR Space](/portalgraph-document-en/quick-start/turn-a-3d-projector-into-a-vr-space.md)
{% endcontent-ref %}

For those who want to build a Portalgraph app

{% content-ref url="/pages/UfCdJYKffBRMRIAw6qmM" %}
[Portalgraph App Creation Tutorial](/portalgraph-document-en/quick-start-developper/portalgraph-app-creation-tutorial.md)
{% endcontent-ref %}

Portalgraph detects the user's viewpoint in real time and renders 3D imagery from that viewpoint, which makes the screen appear to contain a space you can peer into. Viewpoint detection is performed either by face recognition through a webcam or by a VIVE Tracker attached to the head. Portalgraph supports left-right split, top-bottom split, anaglyph, and dual left-right output modes, making it compatible with 3D projectors, 3D televisions, anaglyph glasses, and other 3D display devices.
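The core idea behind head-coupled rendering like this is an off-axis (asymmetric-frustum) projection: the camera frustum is recomputed every frame so that the physical screen acts as a window into the virtual space. The sketch below is not Portalgraph's actual implementation — the function name, coordinate convention (screen centered at the origin in the z=0 plane, eye at z > 0), and numbers are illustrative assumptions — but it shows how the frustum bounds depend on the viewer's position.

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Compute asymmetric frustum bounds (left, right, bottom, top)
    at distance `near`, for an eye at position (x, y, z) in screen
    space, where the screen is centered at the origin in the z=0
    plane and the eye is at z > 0. Units are meters."""
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# A viewer centered in front of a 0.4 m x 0.3 m screen at 0.5 m:
print(off_axis_frustum((0.0, 0.0, 0.5), 0.4, 0.3, 0.1))
# Moving the head to the right shifts the frustum asymmetrically:
print(off_axis_frustum((0.1, 0.0, 0.5), 0.4, 0.3, 0.1))
```

In Unity terms, such bounds would feed a custom projection matrix assigned to the camera each frame as the tracked viewpoint moves.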

The Portalgraph runtime is implemented as software separate from the Portalgraph application itself: it tracks the user's viewpoint and sends the coordinates to the application via OSC. It currently supports face tracking via webcam and tracking via VIVE Tracker.
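Because the viewpoint coordinates travel over OSC, a custom tracker only needs to emit standard OSC packets over UDP. The sketch below encodes an OSC 1.0 message using only the Python standard library; note that the `/head` address and the three-float payload are assumptions for illustration — the actual address and argument layout are defined by Portalgraph's published protocol.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad to a multiple of 4 bytes, as OSC 1.0 requires."""
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    addr = osc_pad(address.encode("ascii") + b"\x00")
    tags = osc_pad(("," + "f" * len(floats)).encode("ascii") + b"\x00")
    args = b"".join(struct.pack(">f", f) for f in floats)
    return addr + tags + args

# Hypothetical head position (x, y, z) in meters:
packet = osc_message("/head", 0.0, 1.6, 0.5)

# Sending over UDP (host, port, and address are assumptions):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 9000))
```

A real tracker would rebuild and send such a packet every frame as the detected viewpoint changes.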

