This article explains how to use Unity and Vuforia to make an Augmented Reality app for your Android device. As a beginner, this is where I had some problems. Creating the Android app should have been as simple as recompiling the Unity project for the Android platform, but the app failed to build in Android Studio. The next problem: the app would not switch to the video after detecting the targets.
Basically, the application uses photos of the Seattle Seahawks’ logo and Russell Wilson as target images. When the app detects a target, it switches to videos of the Hawks’ famous plays.
The application worked with Unity 2017.4.1f1 and Vuforia SDK v7.1.34. This article does not cover running Unity and Vuforia on the Windows 10 desktop, because “Playful Technology” already has an awesome video on YouTube.
Unity and Vuforia Tutorial on Windows 10
There is no way I could explain it better. The author of “Playful Technology” does an excellent job. The video was published in February 2018, and as of April 2018 it was one of the latest tutorials.
Unity and Vuforia are changing their software a lot. I started out trying to make the app based on a tutorial from September 2017, but it was out of date.
Devices used
Windows 10 PC; Samsung Galaxy Tab A, Android version 7.1.1.
The Unity and Vuforia Project
The project is really simple. I am using photos of the Seahawks’ logo and of Russell Wilson as my targets. These 2 photos were added to a “Target Manager” database on the Vuforia website. After you create the database, import it into Unity. When you run Unity, the AR camera detects the targets, and the app shows the Seahawks’ monumental plays. GO HAWKS!!
Unity and Vuforia Tutorial on Android
The 2 problems I encountered were:
The project would not build for Android
Videos attached to the markers would not play.
SOLUTION
For any problems with the app, check the Unity website first. A lot of information is under the Unity “Community” link; both of these solutions were found in the “Forums” and “Answers” sections under that link.
The Build Problem
The Unity/Vuforia project failed to build with jdk-9.0.1. The Unity Forum explained how to fix the problem in Android Studio by changing to jdk-1.8.0_131. You do this by loading the Unity project into Android Studio, then going to File > Project Structure and changing the JDK setting, as shown, to jdk-1.8.0_131.
The Video Transcoding Problem
The next problem was that the app was not switching to the video after it detected the targets. The video had to be transcoded and compressed: without transcoding, the Android device was not able to play the video, and it had to be compressed so it could run on the Android device. Transcoding and compressing the video also cut the size of the Android executable from 126 MB down to 66 MB.
Video of the App Running in Unity and Vuforia on Windows and Android
To get the most benefit from this blog you should have some background in the following areas: Computer Vision applications with OpenCV and marker tracking, and C#/C++ programming.
Introduction
In this article I will demonstrate how to develop an Augmented Reality (AR) application to move your CNC machine. I call the application the AR Joystick. The AR Joystick interfaces with a camera and displays the video. It operates by searching for markers on the machine. When it detects the markers, the app draws a green box around them and draws lines for the X and Y axes. If the box is not drawn, the marker was not detected. The markers determine where the machine can move.
The initial idea came from the book “Arduino Computer Vision Programming”, Chapter 9, “Building a Click-To-Go Robot”, by Ozen Ozkaya. The chapter explains how to write a video application that moves a robot from a video window using the mouse.
About this blog subject
This blog could easily generate at least 10 additional blogs. I will make a series for the additional blogs and keep this one as an overview, spending most of the time on the software. Otherwise the blog would be too long for anyone to read.
The Setup for the AR Joystick
Here are the parts used.
PC running Windows 10 x64
OpenCV 3.2.0 – Aruco (Augmented Reality Library from the University of Cordoba)
Visual Studio 2015 Community Edition
An old CNC machine.
Dynomotion KFLOP motion controller.
Dynomotion Snap Amp DC Servo drive.
Logitech WebCam
2 Laser Markers from Harbor Freight
XBox Joystick.
3 AMT102 encoders.
Power supply 24 VDC.
Refurbishing an old CNC machine
This is the 3rd machine that I have worked on. Refurbishing the machine went faster than expected, because most of the parts were still working. Only the DC servo motors were outdated; they were using analog tachometers. The tachometers were replaced with encoders from CUI, Inc.
I used the Dynomotion KFLOP Motion Controller again for this project. What is different between this new machine and the previous CNC machine? I used the Dynomotion Snap Amp to drive the servo motors instead of the Gecko drives. The Snap Amp was easier to use.
Writing the software for the AR Joystick
The AR Joystick uses 2 programs: the CNC Machine Client and the CNC Video Server. The client is written in C#; the server is written in C++. The server program tracks the markers to set up the origin and the X and Y axes, and tells the client where to move.
CNC Machine Client program with the Xbox Game Controller.
The CNC Machine client software uses the Xbox Game Controller to move the machine.
The client moves the machine and displays the XYZ location. When the client is connected to the server, the server tells it where to move. When it is not connected to the server, the Xbox joystick controls the client. To connect the client to the server, hit the “Connect Pipe” button.
This is what the client looks like.
CNC Machine client
The CNC Video Server Program.
This is where the fun begins. This is where we apply Augmented Reality and Robot Computer Vision to the project.
The “CNC Video Server” shows 2 images of the same scene. The image on the right is the perspective view; the image on the left is the 2D view. The server acquires the video as shown on the right and transforms the image into 2D using the OpenCV warpPerspective function.
The image on the left is where the user controls the machine movements. All the user has to do is click the mouse in the video and the machine moves!!
CNC Video Server
Augmented Reality ARUCO markers to move the machine
The main purpose of the server is to track 4 ARUCO markers and set up a machine coordinate system based on their orientation. Each marker has a specific purpose:
Marker 1 is the origin.
Marker 3 specifies the X-axis.
Marker 2 specifies the Y-axis.
Marker 4 is optional.
The green lines in the video are the X and Y axes. The red lines you see are projected from the laser markers mounted on the machine. These markers show the actual machine position in the live video.
Video Server 3D Perspective View
Video Server 2D View
The server aligns the perspective image into a 2D image. The distance between the markers is known to the server; it defines the scaling, in pixels/mm, for each axis.
When the user clicks the mouse in the 2D window, the server detects the pixel XY location and converts the XY pixels into inches. Next the program sends the XY values to the CNC client. When the client receives the XY values, it moves the machine to the specified XY coordinates.
Applying a Perspective Transform and Warping the Live Video
The OpenCV server displays 2 images of the same scene. One window shows the perspective view; the other shows a 2D view. Here is the OpenCV snippet that transforms the video.
The vector pts_corners contains the 4 center points of the AR markers in the perspective view. The term “vector” refers to the C++ Standard Template Library data structure.
The vector pts_dst contains the same 4 center points, but in the 2D view. Both of these vectors are used to find the homography matrix, which maps the 3D image onto the 2D image.
Mat h = findHomography(pts_corners, pts_dst);             // 3x3 homography from the 4 point pairs
warpPerspective(imageCopy, im_out, h, imageCopy.size());  // warp the live frame into the 2D view
imshow("XYView", im_out);                                 // display the warped image
Handling Mouse Events in OpenCV
Handling mouse events in OpenCV is implemented using a callback. A callback is a pointer to a function. When the user clicks the mouse in the 2D window, the server generates an event for the callback function to process. The callback function returns the location of the mouse. The code is very common in other blogs and articles, and will look like the following snippet.
setMouseCallback("XYView", CallBackFunc, &stagePt);
The callback function will look something like:
void CallBackFunc(int event, int x, int y, int flags, void* ptr)
{
if (event == EVENT_LBUTTONDOWN)
{
// Do something
}
}
Details that I am leaving out for Now
I am leaving out a lot of details. The details will be covered in the next blogs if someone wants more information. Otherwise this blog would be too long.
How to Use the Omron EE-SX671 Limit Switches (Part 2)
This article explains how to use these switches.
Creating ARUCO Markers for the Coordinate System (Part 3)
You will need to create the 3 or 4 markers for the coordinate system. For more information refer to OpenCV Basics 21 – Aruco Marker Tracking on YouTube.
Camera Calibration (Part 4)
Calibration is very important for this software. Without calibration the machine movements would not be as accurate. For more information refer to OpenCV Basics 14 – Camera Calibration on YouTube.
Updating an Old CNC Machine: Servo Tuning for KFLOP and Snap Amp (Part 5)
If anyone is interested in a blog about this subject let me know.
Video camera Controller
The camera is a simple Logitech 1080p webcam. It costs about $70. To write software to control the camera, refer to OpenCV Basics 12 – Webcam & Video Capture on YouTube.
Using Pipes to communicate between the programs.
Named pipes are used for the client and server to talk to each other.
Limitations:
I need to emphasize: THE SOFTWARE IS NOT PRODUCTION CODE. I would not put the code in mission-critical applications. The software is only a prototype, written to prove a simple concept. Also, the accuracy of the stage movements is not great.
Credits
AMT102 Encoders from CUI Inc.
George Lecakes – OpenCV Tutorial on YouTube. I highly recommend these videos. There are 21 videos; each one is about 10 to 15 minutes long.
OpenCV Basics 01- Getting setup in Visual Studio 2015 for Windows Desktop.
OpenCV Basics 11- Building OpenCV Contribute to CMAKE.
OpenCV Basics 12- Webcam & Video Capture.
OpenCV Basics 14-Camera Calibration Part 1, Part 2, Part 3, Part 4