Computer Graphics and Multimedia Software - Knowledge Sharing

Monday, November 20, 2017

openFrameworks basic tutorial: Leap Motion gesture recognition

The following repository version builds without problems, or you can get the addon directly from the openFrameworks addons page.
First of all, as a Hello World-like sample, let's recognize and display a hand. Each fingertip can be detected independently. Easy!
There are two approaches available in ofxLeapMotion. The first is to use the simple hand model, ofxLeapMotionSimpleHand, defined in ofxLeapMotion. The second is to access the functions provided by Leap Motion's SDK directly.
First, let's draw using the simple ofxLeapMotionSimpleHand. The approach is very simple: once a hand has been recognized as an ofxLeapMotionSimpleHand, calling the debugDraw() method on that instance displays a simple 3D model of the hand.
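Before the full example below, the minimal use of debugDraw() looks roughly like this (a sketch assuming the `leap` and `simpleHands` members declared in the header further down):

```cpp
// In testApp::draw(): let each recognized simple hand draw itself.
// debugDraw() renders a basic 3D stick model of the hand for debugging.
for(int i = 0; i < simpleHands.size(); i++){
    simpleHands[i].debugDraw();
}
```

The example that follows skips debugDraw() and instead draws a box at each fingertip position, which shows how to get at the raw coordinates.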

testApp.cpp

#include "testApp.h"

void testApp::setup(){
    ofSetFrameRate(60);
    ofSetVerticalSync(true);
    ofBackground(31);

    ofEnableLighting();
    light.setPosition(200, 300, 50);
    light.enable();
    cam.setOrientation(ofPoint(-20, 0, 0));

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_NORMALIZE);
    leap.open();
}

void testApp::update(){
    simpleHands = leap.getSimpleHands();

    if( leap.isFrameNew() && simpleHands.size() ){
        fingerPos.clear();

        leap.setMappingX(-230, 230, -ofGetWidth()/2, ofGetWidth()/2);
        leap.setMappingY(90, 490, -ofGetHeight()/2, ofGetHeight()/2);
        leap.setMappingZ(-150, 150, -200, 200);

        for(int i = 0; i < simpleHands.size(); i++){
            for(int j = 0; j < simpleHands[i].fingers.size(); j++){
                ofVec3f pos = simpleHands[i].fingers[j].pos;
                fingerPos.push_back(pos);
            }
        }
    }
    leap.markFrameAsOld();
}

void testApp::draw(){
    cam.begin();
    for(int i = 0; i < fingerPos.size(); i++){
        ofBoxPrimitive box;
        box.setPosition(fingerPos[i].x, fingerPos[i].y, fingerPos[i].z);
        box.set(20);
        box.draw();
    }
    cam.end();
}

testApp.h

#include "ofMain.h"
#include "ofxLeapMotion.h"

class testApp : public ofBaseApp{

public:
    void setup();
    void update();
    void draw();

    void keyPressed(int key);
    void keyReleased(int key);
    void mouseMoved(int x, int y);
    void mouseDragged(int x, int y, int button);
    void mousePressed(int x, int y, int button);
    void mouseReleased(int x, int y, int button);
    void windowResized(int w, int h);
    void dragEvent(ofDragInfo dragInfo);
    void gotMessage(ofMessage msg);
    void exit();

    ofxLeapMotion leap;
    vector <ofxLeapMotionSimpleHand> simpleHands;
    ofEasyCam cam;
    ofLight light;
    vector <ofVec3f> fingerPos;
};

The output : 


Share:

openFrameworks with Leap Motion example

ofxLeapMotion is an addon for working with Leap Motion in openFrameworks, and its prepared ofxLeapMotionSimpleHand class makes it very easy to get hand coordinates. However, that class alone does not expose the full functionality of Leap Motion's SDK.
ofxLeapMotion therefore also provides a way to obtain information from the Leap Motion SDK directly, which makes much more varied information available. For details, refer to the documentation provided by Leap Motion.
In the sample below, we first acquire the Leap hands, obtain the position of each finger, then obtain and display the virtual sphere fitted to the curvature of each hand.

testApp.h

#include "ofMain.h"
#include "ofxLeapMotion.h"

class testApp : public ofBaseApp {

public:
    void setup();
    void update();
    void draw();

    void keyPressed(int key);
    void keyReleased(int key);
    void mouseMoved(int x, int y);
    void mouseDragged(int x, int y, int button);
    void mousePressed(int x, int y, int button);
    void mouseReleased(int x, int y, int button);
    void windowResized(int w, int h);
    void dragEvent(ofDragInfo dragInfo);
    void gotMessage(ofMessage msg);
    void exit();

    ofxLeapMotion leap; // the main class of Leap Motion
    // vector <ofxLeapMotionSimpleHand> simpleHands; // simple hand model (unused here)

    ofEasyCam cam;   // camera
    ofLight light;   // light
    vector <ofVec3f> fingerPos;  // array of finger positions
    vector <ofVec3f> spherePos;  // array of centers of the spheres fitted to each hand
    vector <float> sphereSize;   // array of radii of the spheres fitted to each hand
};


testApp.cpp

#include "testApp.h"

void testApp::setup(){
    ofSetFrameRate(60);
    ofSetVerticalSync(true);
    ofBackground(31);
    ofEnableLighting();
    light.setPosition(200, 300, 50);
    light.enable();
    cam.setOrientation(ofPoint(-20, 0, 0));
    glEnable(GL_DEPTH_TEST);

    // Leap Motion
    leap.open();
}

void testApp::update(){
    vector <Hand> hands = leap.getLeapHands();
    if( leap.isFrameNew() && hands.size() ){
        fingerPos.clear();
        spherePos.clear();
        sphereSize.clear();

        leap.setMappingX(-230, 230, -ofGetWidth()/2, ofGetWidth()/2);
        leap.setMappingY(90, 490, -ofGetHeight()/2, ofGetHeight()/2);
        leap.setMappingZ(-150, 150, -200, 200);

        for(int i = 0; i < hands.size(); i++){
            for(int j = 0; j < hands[i].fingers().count(); j++){
                const Finger & finger = hands[i].fingers()[j];
                ofVec3f pt = leap.getMappedofPoint( finger.tipPosition() );
                fingerPos.push_back(pt);
            }

            ofVec3f sp = leap.getMappedofPoint(hands[i].sphereCenter());
            float r = hands[i].sphereRadius();
            spherePos.push_back(sp);
            sphereSize.push_back(r);
        }
    }
    leap.markFrameAsOld();
}

void testApp::draw(){
    cam.begin();
    for(int i = 0; i < fingerPos.size(); i++){
        ofSpherePrimitive sphere;
        sphere.setPosition(fingerPos[i].x, fingerPos[i].y, fingerPos[i].z);
        sphere.draw();
    }

    for(int i = 0; i < spherePos.size(); i++){
        ofSpherePrimitive sphere;
        sphere.setPosition(spherePos[i].x, spherePos[i].y, spherePos[i].z);
        sphere.setRadius(sphereSize[i]*1.5);
        sphere.draw();
    }
    cam.end();
}

The output : 



Share:

Wednesday, November 16, 2016

Create UE4 Projects From File Examples

From the Epic Games Launcher, there are a lot of example files you can access and download for free. But once you download, what do you do with them?
Anytime you download any of the project examples from the Learn Section or Marketplace Section, you will find these files inside Library > Vault Section.
Here is how to create projects from these downloaded files. 

1. DOWNLOAD PROJECT EXAMPLE FILES


First you want to navigate to Learn or Marketplace section and download any of the available Engine Feature Samples, Gameplay Concept Examples, Example Game Projects or Marketplace Content.
Learn Section:


2. LIBRARY: VAULT SECTION
You will find all the downloaded files under Library tab and under Vault section. Just scroll down to find them.

3. CREATE PROJECT FROM DOWNLOADED PROJECTS

You can't open these downloaded files directly. You need to Create Project from them. This is to make sure that you always have access to a clean downloaded project example.
In Library > Vault section, click on Create Project:

4. CHOOSE PROJECT NAME AND LOCATION

After you click on Create Project, you have to set up where you want to store this project.
Choose the folder where you want to store this project. The default location is on your C drive, under "C:\Users\UserName\Documents\Unreal Projects". But if, like me, you have a second hard drive and don't want huge Unreal project folders on your main drive, click Browse and navigate to where you want to store your project.
  • Name the project
  • Set storage location on your computer
  • Create project

5. PROJECT CREATED AND READY TO BE LAUNCHED

You will now see this new project appear under your "My Projects" section in the Library tab.

You can now open this project from here by double clicking on the thumbnail icon or Right Clicking and choosing Open.





6. WHERE TO FIND DOWNLOADED VAULT CONTENT

Downloaded project example files that you see inside the Vault section are stored in "C:\Program Files\Epic Games\Launcher\VaultCache":

These downloaded Vault files get very large, so once you have created a project from them or added their assets into a project, you can move these Vault files somewhere else or delete them.
I often back them up on a larger drive rather than on my main SSD, which isn't very large.
Once deleted or moved, these Vault files will no longer appear inside the Library > Vault section.

7. CONTENT ADDED INTO PROJECTS

If you happen to download assets such as static meshes, sounds or materials, you won't be creating a project from them. Instead, you can add them into a project you have already created and are working on.
Click on "Add to Project":

Choose which project you want to add these assets into:

The assets will appear in Content Browser, when you launch this project.
Share:

Tutorial Third person game for Unreal Engine 4 Using C++


Third-person survival game for Unreal Engine 4, made entirely in C++. Originally built as a six-section tutorial series, now available as an open-source C++ sample project.

NEW: Mod Support
Includes two small mod examples: a Pink Rifle extension and a Flashlight replacement mod. Check out the Modding Sample Project for guidelines and cooker profile setup.
Example command-line argument to load the mod gamemode with the built-in level "CoopLandscape" in a cooked game build: SurvivalGame.exe /Game/Maps/CoopLandscape_Map?game=/ExtendedRifleMod/SurvivalGameMode_PinkRifle.SurvivalGameMode_PinkRifle_C
There is currently no supporting UI to load specific mod content; please note that the modding pipeline is experimental and intended for early adopters only!
Section 1
This section sets up the third person character movement with animation, object interaction, simple hunger system, all with networking support.

Section 2

Adds weapon support for the character, a flashlight, UT-style inventory with on-character visual representation of the carried items and deals with damage, death and respawns for players.

Section 3

Introduces AI "Zombie" enemy to our game using PawnSensing and Behavior Tree.
Section 4
Introduces a gameloop with enemies, items, scoring and a time of day.

Section 5 

Introduces the ability to carry around objects like barriers and discusses game networking.

Section 6

The final section in the series focuses on bug fixing and a bit of polish to the existing features. This section is compatible with the 4.8 release.
Share:

Tuesday, November 8, 2016

Develop a working osgART application

What is a Scene Graph?

A tree-like structure for organising a virtual world, e.g. VRML.
A hierarchy of nodes that define: groups (and switches, sequences, etc.), transformations, projections, geometry...

And states and attributes that define:

  •  Materials and textures
  •  Lighting and blending

Benefits for performance:

  •  Structuring data facilitates optimization: culling, state management, etc.
What is Open Scene Graph (OSG)?

Open-source scene graph implementation
Based on OpenGL
Object-oriented C++ following design pattern principles
Used for simulation, games, research, and industrial projects
Maintained by Robert Osfield | Documentation project: www.osgbooks.com
Uses the OSG Public License (similar to LGPL)

What is osgART?

OSG + ARToolKit = osgART
osgART adds AR to Open Scene Graph
Current version 2.0, Open Source

To add Video see-through AR:
  •  Integrate live video
  •  Apply correct projection matrix
  •  Update tracked transformations in real-time
Develop a working osgART application from scratch?

Use the ARToolKit 2.72 library for tracking and video capture
Install OSG

Tutorial 1 : Basic OSG Viewer | OSG 
Install osg (.exe) on your computer
Run the basic viewer, located in the osg folder in C:\Program Files (Windows)

Tutorial 2: Adding Video | OSG + ARToolkit
Add a video plugin
Load, configure, start video capture…
Add a video background
Create, link to video, add to scene-graph

Tutorial 3: Tracking | OSG + ARToolkit
Add a tracker plugin
  Load, configure, link to video
Add a marker to track
  Load, activate
Tracked node
  Create, link with the marker via tracking callbacks
Print out the tracking data

Tutorial 4: Adding Content | OSG + ARToolkit
Now put the tracking data to use!
Add content to the tracked transform
Basic cube code
Replace the simple cube with a 3D model, e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
Models are loaded using the osgDB::readNodeFile() function
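As a sketch of that last step (assuming OSG is installed; "cow.osg" is just an example model file name), loading a model and attaching it under a transform node looks like this:

```cpp
// Load a model in any format OSG has a reader plugin for (.obj, .flt, .3ds, ...).
osg::ref_ptr<osg::Node> model = osgDB::readNodeFile("cow.osg");

// Attach it under a MatrixTransform; in osgART, this transform is the node
// that gets updated from the marker's tracking data every frame.
osg::ref_ptr<osg::MatrixTransform> markerTransform = new osg::MatrixTransform;
if (model.valid()) {
    markerTransform->addChild(model.get());
}
```

Because the model sits below the tracked transform in the scene graph, it automatically follows the marker with no extra per-frame code of your own.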


OSGART sample project by other researchers


Share:

Friday, October 24, 2014

Malaysia UTM 2-Days Professional Course | Developing Mobile Augmented Reality Apps 2014

Hello Malaysia,

Grab this opportunity to learn to develop augmented reality applications. This short course will teach you the fundamental principles of augmented reality and mobile AR app development techniques.

Ever wondered what an interactive mobile application would look like when, viewed through its camera, virtual objects immersively overlay onto our real world? It looks cool and very impressive. This is a new dimension of mobile experience where augmented reality technology will flourish the most.

We bring you an expert in Augmented Reality (AR), a most knowledgeable professor, to guide you in developing your own mobile AR apps.

Date : 29 NOVEMBER - 30 NOVEMBER 2014
Time : 10.00 AM - 05.00 PM
Venue : UTMSPACE KUALA LUMPUR

Contact :
comp.utm.my/mivielab

Book now!
Next training : December 2016 | February 2017 | April 2016
* subject to change, please WhatsApp us to confirm





Share:

Monday, February 4, 2013

Exporting Android AR app with Vuforia and Unity3D

Unity Pro or Unity Android required?

Even though the extension is compatible with both, this is a Unity Pro-only feature; it does not work with the Unity Basic version. Unfortunately, this is a limitation of Unity, which only allows native plugins in the Pro version of the editor. There are, however, a few ways to work around it, although they may require some technical effort.

Unity3D and Vuforia – Tutorial Exporting Android AR App

In this tutorial, using Unity3D and Vuforia, we show how to export an augmented reality app.

Unity3D is also available to download from here
Download # Download and Import Unity3D Extension v4.2.3  (*.unitypackage file)
Unity3D extension is also available to download from here
Stand-Alone Android SDK is also available to download based on your OS from here
Download # Money Images for Tracking (Euros)  (*.rar file)
Download # All Money Models with Textures  (*.rar file)


NOTE :

The Vuforia™ SDK allows you to build vision-based augmented reality applications. It is available for Android, iOS and as an Extension to Unity - a cross-platform game engine.

A Vuforia SDK-based AR application uses the display of the mobile device as a "magic lens" or looking glass into an augmented world where the real and virtual worlds appear to coexist. The application renders the live camera preview image on the display to represent a view of the physical world. Virtual 3D objects are then superimposed on the live camera preview and they appear to be tightly coupled in the real world.

For more information see www.vuforia.com.

An application developed for Vuforia will give your users a more compelling experience:
  • Faster local detection of targets
  • Cloud recognition of up to 1 million targets simultaneously
  • User-defined target for run-time target generation
  • Robust tracking – augmentations stick to the target and are not easily lost as the device moves
  • Simultaneous tracking of up to five targets
  • Better results in real world conditions – low light, partially covered target
  • Optimizations that ensure better and more realistic graphics rendered on the target
The diagram below gives you an overview of the application development process with the Vuforia platform. The platform consists of the Vuforia Engine (inside the SDK), the Target Management System hosted on the developer portal (Target Manager), and optionally the Cloud Target Database.

How do you run the Android Extension?
You’ll be able to use Play Mode for Vuforia in Unity by following these steps:
  1. Install the Vuforia AR Extension for Unity release 2.0 into your project.  It will be available from the Unity Asset Store and from the Vuforia developer portal, as usual.
  2. Build your AR app.
  3. When you’re ready to test, click the play button in the editor and hold your target in front of the webcam.  Your scene will run in the game view as if you were running the app on a mobile device.
Share:

Malaysia Augmented Reality

It's incredibly difficult to discover the undiscovered. Here you will see how I can help you. Cheers - Ajune (comp.utm.my/ajune)
ajune@utm.my. Powered by Blogger.