
WHAT IS IT?
H-Lobox was born from the union of an experimental application developed by Ondazerostudio for interactive stage/event visual installations and the showcases crafted in the Hackspace Catania laboratory.
Basically, it’s a holographic “Pepper’s ghost”-type showcase: a box inside which objects appear with a three-dimensional effect.
In the most advanced version it’s possible to interact with the virtual objects inside the showcase, and/or select them. By managing contents through an application, a holographic showcase becomes more versatile, customizable, and updatable with new contents. The application can also be arranged to display in front/rear projection for mapping and visual shows on stage.
THE OPTICAL WORKING PRINCIPLE
It’s based on a simple law of light reflection, according to which a light ray hitting a planar surface is reflected at an equal and opposite angle.
If this surface is transparent, or better, semi-reflective, and in place of a single light ray there is an image, the image will appear to form on a virtual plane behind the surface, at a distance equal to the distance of the source from the reflective surface.
If the angle of incidence is exactly 45°, the holographic image will appear on a virtual plane perpendicular to the source image. So, if the screen is horizontal and facing down, the object will appear on a virtual plane perfectly vertical for the observer. This plane intersects the screen along the same line where the reflective surface meets it.
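In formulas (a standard statement of the reflection law, with symbols of my choosing, not from the original): the reflected angle equals the incident angle, θr = θi, and the virtual image forms behind the surface at a distance equal to that of the source, d(image) = d(source). With θi = 45°, the virtual plane is rotated 2 × 45° = 90° with respect to the screen, which is why a horizontal, downward-facing screen produces a perfectly vertical virtual image.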
HISTORICAL BACKGROUND

More than two centuries later, in 1858, the English engineer Henry Dircks, recovering the device of della Porta, patented the “Dircksian Phantasmagoria”, a special theater installation that reproduced a room inside which a ghost could appear during an opera.
According to rumors, at that time this technique was also used by charlatans to swindle peasants and extract money from them in exchange for a “communication with the beyond” service. Dircks’s intent was to inform people about these scams by explaining the effect through theater shows.
However, the installation was too expensive, required the complete restructuring of most theaters, and had no follow-up.
In the following decades, Dircks’s technique, as well as Pepper’s, would inspire a lot of analog special effects for cinema and for amusement park attractions. In modern times, with the advent of digital projectors and monitors, this technique is regaining popularity in many different formats and for many purposes.
HOW IS IT DONE?

This scheme refers to a showcase made for a semi-permanent installation. It consists of a stand, inside which a laptop or a Blu-ray player is placed. There is a slot for human-tracking devices, like Kinect or Leap Motion. On the table it’s possible to place real scenographic objects. Above the plane there are the monitor stand, a protective frame, the monitor, and the covering grid.



In interactive versions, the monitor receives its signal from an HDMI or VGA cable connected to a computer.
SOFTWARE INTERFACE
The software that manages the interactive and non-interactive contents, and their position on the screen, is built entirely on the Unity 3D engine. Development started in January 2015, with the purpose of creating an application to manage multiple interactive contents for VJing and video mapping software.
The app was initially developed on Mac and has been improved over time with new functionality.
Version 1) Different 3D objects, animated and rendered in real time, could be displayed and changed with the numeric keys on a computer.
Version 2) Introduced a 3–4 camera real-time system for holographic pyramids. Each camera has a Syphon server attached, so each side of the pyramid can be mapped separately on a layer.
Version 3) Cameras can be selected individually and repositioned directly from the application interface with keys. Mapping software is no longer necessary for camera mapping.
Version 4) Introduced characters with real-time MOCAP via Kinect. Virtual characters displayed inside the box follow the user’s gestures in real time. Kinect data acquisition is obtained through the OpenNI libraries and NiTE middleware. Kinect is then interfaced with Unity through the ZigFu or NiMate plugins. Both consist of an independent bundle running on the OS and a plugin inside the application to retrieve tracking data. The pairing between tracked joints and virtual character joints is done manually during the development phase (see the sketch after this list).
Version 5) Introduced augmented reality elements. The user is tracked in real space and reproduced, scaled, in virtual space. There the user can interact with virtual objects displayed inside the box. Position, distance, force fields, and colliders are used to trigger animations and dynamic effects. Finally, small real objects can be reproduced (or 3D scanned) and imported into the virtual space, enabling interactions between virtual objects and real objects inside the box.
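A minimal sketch of the manual joint pairing described in Version 4, assuming a stand-in tracker class (TrackerStub and its GetJointRotation method are illustrative placeholders, not the actual ZigFu/NiMate API):

using UnityEngine;

// Illustrative stand-in for the tracking plugin; the real ZigFu/NiMate
// plugins expose similar per-joint data. The stub returns identity
// rotations so that the sketch compiles on its own.
public static class TrackerStub
{
    public static Quaternion GetJointRotation(string jointName)
    {
        return Quaternion.identity; // a real plugin would return tracked data
    }
}

// Manual joint pairing: the character's bones are assigned by hand in the
// Inspector during development, as described above.
public class JointPairing : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;
    public Transform head;

    void Update()
    {
        leftHand.rotation = TrackerStub.GetJointRotation("LeftHand");
        rightHand.rotation = TrackerStub.GetJointRotation("RightHand");
        head.rotation = TrackerStub.GetJointRotation("Head");
    }
}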
This installation was of the 3-sided type, with assisted interactivity and application control, developed for Mac only.
Different objects are displayed, some pre-animated, some with different kinds of interactivity. The objects are selectable by pressing the numeric keys. It’s also possible to activate a timer for auto-switching contents (a minimal switching sketch follows the object list below).
One after another, or on command, different objects appear, such as:
an animated Hackspace logo, a robot that copies the user’s gestures, a dragon that changes pose when someone approaches, floating marbles attracted by the user’s hands, a mechanical eye that follows the closest user and changes color depending on distance, and a real miniaturized jar (placed inside on purpose) with particles bouncing on it.
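A minimal sketch of the switching logic just described, assuming the content root objects are assigned in the Inspector (field names and the 10-second default interval are my illustrative choices):

using UnityEngine;

// Numeric keys 1-9 select a content directly; an optional timer
// auto-advances through the contents, as described above.
public class ContentSwitcher : MonoBehaviour
{
    public GameObject[] contents;       // one root object per content
    public bool autoSwitch = false;     // timer-driven switching
    public float switchInterval = 10f;  // seconds, arbitrary default

    int current = 0;
    float elapsed = 0f;

    void Update()
    {
        for (int i = 0; i < contents.Length && i < 9; i++)
        {
            if (Input.GetKeyDown(KeyCode.Alpha1 + i))
                Show(i);
        }

        if (autoSwitch)
        {
            elapsed += Time.deltaTime;
            if (elapsed >= switchInterval)
            {
                elapsed = 0f;
                Show((current + 1) % contents.Length);
            }
        }
    }

    void Show(int index)
    {
        // Activate only the selected content, hide the others.
        for (int i = 0; i < contents.Length; i++)
            contents[i].SetActive(i == index);
        current = index;
    }
}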

H-LOBOX PROJECT for CMC 2016 – Domina Coral Bay
The interface is very minimal. When started, a stickman is displayed with a male and a female symbol at its sides. As soon as the user is tracked, they see the stickman copying their gestures and can bring its hand to touch one of the symbols.
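A sketch of that selection gesture, assuming the stickman’s hand and the two symbols are plain Transforms (the 0.3-unit threshold and all names are my assumptions):

using UnityEngine;

// When the tracked hand comes close enough to a symbol, that symbol
// is considered selected. Here the selection is just logged.
public class SymbolSelector : MonoBehaviour
{
    public Transform hand;              // stickman hand driven by tracking
    public Transform maleSymbol;
    public Transform femaleSymbol;
    public float selectDistance = 0.3f; // scene units, illustrative

    void Update()
    {
        if (Vector3.Distance(hand.position, maleSymbol.position) < selectDistance)
            Debug.Log("Male symbol selected");
        else if (Vector3.Distance(hand.position, femaleSymbol.position) < selectDistance)
            Debug.Log("Female symbol selected");
    }
}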



(Note for developers) Windows version:
1) Tracking quality with the Microsoft SDK is cleaner and smoother than with the Mac OS X libraries. Moreover, the ZigFu plugin doesn’t work properly under Windows 10.
2) The need to install multiple workstations on a limited budget.
Development under Windows required a dramatic reconstruction of the application architecture. That’s because the Kinect SDK for Windows crashed on scene changes or when disabling characters, even when using Unity’s “don’t destroy” function.
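For reference, the “don’t destroy” pattern mentioned above usually looks like this in Unity (a generic pattern, not the authors’ final workaround, since the text reports it still crashed):

using UnityEngine;

// Keeps a single Kinect manager object alive across scene loads.
public class KinectManagerSingleton : MonoBehaviour
{
    static KinectManagerSingleton instance;

    void Awake()
    {
        if (instance != null)
        {
            Destroy(gameObject); // a later scene already spawned one
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject); // survive scene switches
    }
}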
WORKFLOW
1 – PROJECT: planning of the physical structure of the H-Lobox (3D model).
2 – FLOWCHART: determine the application architecture, number of scenes, scene-switch triggers, all interactive and decorative elements to be imported into the scene, and interactivity tools.
3 – MODELING: making of the 3D models of virtual characters (modeling, texturing & materials, rig & skinning, pose & test).
4 – CRAFTING: making of the
physical structure (cutting, assembly, decoration)
5 – VIRTUAL SCENOGRAPHY: virtual scene compositing and importing of virtual characters.
6 – CODING: writing of code and installation of third-party plugins.
7 – DEVELOPMENT: application build, pairing of code and game objects, game manager scripting, basic functionality tests.
8 – HARDWARE CONFIGURATION: computer configuration and installation of the application and of drivers for Kinect/Leap Motion.
9 – DELIVERING: hardware integration, final decorations, and delivery/installation on site.
1) BOX PLANNING

- The semi-reflective surface must be inclined at 45° (see the geometry note after this list).
- The box height (h) must be slightly less than the monitor height (the real display size).
- The plane (on which real objects can be placed) must be 1/4 to 1/3 deeper than the monitor height.
- Both Kinect and Leap Motion should be placed 80 cm to 1 m above the ground.
- The reflective surface must be clean, free of scratches, and kept perfectly flat by using side brackets and a horizontal crossbar on top. Excessive thickness must be avoided because it causes double reflections, but thin sheets of both glass and synthetic materials tend to bend, producing bad image deformations.
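A quick sizing note implied by the 45° requirement (my derivation, not from the original): a semi-reflective sheet spanning a box of height h at 45° must cover the diagonal, so its length is l = h / sin 45° = h·√2 ≈ 1.41·h. For example, a 50 cm tall box needs a sheet roughly 71 cm long.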
2) FLOWCHART
The whole application structure must be planned according to the intended usage of the installation. There are basically four types (a configuration sketch follows this list):
- Passive semi-permanent (an external DVD/media player linked to the box with an HDMI cable).
- Passive portable/compact (contents are displayed directly on a smartphone/tablet screen).
- Assisted interactive (interactive contents managed by an operator and mixed in VJ software with other contents).
- Permanent interactive (automated startup and shutdown, limited contents).
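A hypothetical configuration sketch for these four modes (all names are mine; the tracking policy in Awake is only an example):

using UnityEngine;

// The four usage types above, expressed as a single enum that can
// drive the application setup.
public enum InstallationMode
{
    PassiveSemiPermanent,  // external media player over HDMI
    PassivePortable,       // smartphone/tablet screen
    AssistedInteractive,   // operator-managed, mixed in VJ software
    PermanentInteractive   // automated startup/shutdown, limited contents
}

public class InstallationConfig : MonoBehaviour
{
    public InstallationMode mode = InstallationMode.AssistedInteractive;

    void Awake()
    {
        // Example policy: only the interactive modes enable tracking devices.
        bool tracking = mode == InstallationMode.AssistedInteractive
                     || mode == InstallationMode.PermanentInteractive;
        Debug.Log("Tracking enabled: " + tracking);
    }
}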
3) MODELING
Character models can be created with Maya as well as Blender, C4D, 3ds Max, LightWave, etc. The cheapest way is to model everything with hard-surface techniques and keep the polygon count below 15,000.
Hair models intended for dynamic simulation should be exported as a separate skinned mesh, for simulation in Unity.
The format used for export was FBX, with embedded media, skeleton definitions, and animations.
4) CRAFTING
Materials can vary. Smaller models can be made entirely of plexiglass, including a thin reflective surface (0.5 mm). Glass has to be avoided for any mobile or non-permanent installation. Big showcases should be made of wood/aluminium bars.
As a covering, non-woven fabric (TNT) is cheap and fast to place and resize. Plywood is more solid. Forex (foamed PVC) is better still, but expensive.
The reflective surface in big showcases must be reinforced with brackets and a crossbar on top. In giant formats, a better solution is a frame with a PVC or holographic foil stretched under tension.
5) VIRTUAL SCENOGRAPHY

6) CODING
In this phase, all the code required for game object behaviours is written. The languages used in Unity are C# and JavaScript.
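As an example of this phase, a minimal behaviour of the proximity-triggered kind used for the dragon described earlier (the names, the 1.5 m threshold, and the “UserNear” animator parameter are my assumptions):

using UnityEngine;

// Switches an animator flag when a tracked user gets close, similar to
// the dragon that changes pose when someone approaches.
public class ProximityPose : MonoBehaviour
{
    public Transform user;               // tracked user position
    public Animator animator;            // character animator
    public float triggerDistance = 1.5f; // metres, illustrative value

    bool near = false;

    void Update()
    {
        bool nowNear = Vector3.Distance(user.position, transform.position)
                       < triggerDistance;
        if (nowNear != near)
        {
            near = nowNear;
            animator.SetBool("UserNear", near); // parameter name assumed
        }
    }
}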
7) DEVELOPMENT
Scripts are paired with game objects, and the game manager script is written to coordinate the whole application.
8) HARDWARE CONFIGURATION
For interactive versions, a PC is configured with an OS (Unity can build for Windows, Mac OS X, and Linux). An Arduino board can be configured too, especially for automated models with customized MIDI controllers.
9) ASSEMBLY & DELIVERY
If all test runs are OK, all the hardware is placed inside the box and the device is ready for transport.
Bigger installations must be assembled on site.
CASE HISTORY AND FUTURE IMPLEMENTATIONS
So far, the installation has been exhibited in the assisted interactive version, with a pyramidal layout, at the Hackspace stand at Etnacomics 2015, and in a customized version for the CMC 2016 event at the Domina Coral Bay Hotel, by MCA Events. In the latter case, showcases of different sizes (46” to 50”) were built in both semi-assisted interactive and passive versions.
At all the events, the installation attracted the attendees, confirming the effectiveness of the holographic system itself, both paired with an application for content control and interactivity and standing alone. The context also brought out the limits of this kind of installation. In particular, tracking with Kinect, especially for the MOCAP and body-mapping functions, is limited by the need for a wide free space and for a single user standing in front of the box at a minimum distance of 3 m: technically impossible in an expo/showroom context.
Animations driven by people passing by, by their approach to the showcase, or by their overall position are more effective and less affected by Kinect’s limits.
MOSAICO MIRAGE TESTING ON PEPPER'S GHOST PYRAMID
MOSAICO 4.0 MIRAGE – Holographic Body Reactive Installation Prototype, from Ondazerostudio on Vimeo.
Development of the software will continue with hand and face tracking, focused on small-scale interaction.
MOCAP development will continue on different kinds of installations, like back-projections, interactive screens, holographic stages, and body-mapping ballet/dance shows.