Selective Highlighting of Vertices in Unity3D meshes

August 23 | 18

TL;DR: I created a behaviour script that lets you highlight vertices on any type of mesh in Unity3D.

I have been playing lately with mesh deformations, and I needed a way to visually see the selected vertices (and their neighbor vertex connections) on the meshes I was manipulating. To do this I created a simple MonoBehaviour script: you attach it to an empty game object that is a child of the mesh being manipulated, and you simply call two functions to highlight or un-highlight vertices of said mesh.

Someone might find this utility useful. There might be other ways to do it, but this one works pretty well for my needs. If you find it useful, let me know; it makes me happy to know that I helped someone on the interwebs with my work :).

It's worth mentioning that the mesh to manipulate *needs* to have a MeshCollider. This is how it looks:

Anyways. Without further ado. Here it goes:

And a simple script that makes it work by “Painting” with the left click on certain parts of the mesh.

To make it work, simply create an empty Game Object as a child of the mesh to highlight and attach the “NeighborVertexHighlighter” script component to it. To test the script you can attach the “PaintRemoveVerticesExample” script to the mesh to highlight. *It's important to remember that the mesh to be highlighted requires a MeshCollider component.*

How it works:

The way this works is by creating, for each vertex, an array of integers holding the indices of that vertex's neighbors. This neighbor list is calculated on a separate thread (so it doesn't block the main thread) and runs in O(T), where T is the number of triangles in the mesh.
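The neighbor computation can be sketched like this (a simplified version, not the actual script; the function name is mine). Each triangle contributes three edges, so one pass over the triangle index list is enough:

```csharp
using System.Collections.Generic;

// Sketch: build a per-vertex neighbor table from a triangle index list
// in O(T), where T is the number of triangles.
static List<HashSet<int>> BuildNeighbors(int vertexCount, int[] triangles)
{
    var neighbors = new List<HashSet<int>>(vertexCount);
    for (int i = 0; i < vertexCount; i++)
        neighbors.Add(new HashSet<int>());

    // Each triangle (a, b, c) contributes the edges (a,b), (b,c), (c,a).
    for (int t = 0; t < triangles.Length; t += 3)
    {
        int a = triangles[t], b = triangles[t + 1], c = triangles[t + 2];
        neighbors[a].Add(b); neighbors[b].Add(a);
        neighbors[b].Add(c); neighbors[c].Add(b);
        neighbors[c].Add(a); neighbors[a].Add(c);
    }
    return neighbors;
}
```

One note on the threading: Unity's Mesh API can only be touched from the main thread, so the `mesh.triangles` array has to be copied on the main thread first and then handed to the worker.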

Two main functions can be called on the highlighter script: AddIndex(int i) and RemoveIndex(int i). One just needs to pass the index of the vertex to highlight, and the script will automatically create a mesh with the indices that are going to be highlighted.
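A minimal sketch of a “painting” caller (not the actual PaintRemoveVerticesExample script; class and field names here are mine). It raycasts from the mouse, uses the hit triangle to find the closest vertex, and passes that index to the highlighter. This is also why the MeshCollider is required: `RaycastHit.triangleIndex` only works with one.

```csharp
using UnityEngine;

// Sketch: paint vertices under the mouse with the left button held down.
public class PaintExample : MonoBehaviour
{
    public NeighborVertexHighlighter highlighter; // the component described above

    void Update()
    {
        if (!Input.GetMouseButton(0)) return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // In a real script, cache these arrays instead of fetching per frame.
            Mesh mesh = hit.collider.GetComponent<MeshCollider>().sharedMesh;
            int[] tris = mesh.triangles;
            Vector3[] verts = mesh.vertices;

            // Pick the corner of the hit triangle closest to the hit point.
            int best = tris[hit.triangleIndex * 3];
            float bestDist = float.MaxValue;
            for (int i = 0; i < 3; i++)
            {
                int idx = tris[hit.triangleIndex * 3 + i];
                float d = Vector3.Distance(
                    hit.collider.transform.TransformPoint(verts[idx]), hit.point);
                if (d < bestDist) { bestDist = d; best = idx; }
            }
            highlighter.AddIndex(best);
        }
    }
}
```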

Internally, a Dictionary<Vector3, VertexNeighbor> contains the indices that are currently highlighted. This dictionary is modified by the AddIndex and RemoveIndex functions. It is worth noting that I used the position of the vertex as the key instead of its index in the vertex array, because I found that some meshes (like the base cube in Unity) contain several separate vertices at the exact same position; keying by index would highlight only the neighbors of that single index, whereas keying by position covers all the coincident vertices as well.
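The position-keyed grouping can be sketched as follows (simplified: the real script stores a VertexNeighbor object per key; here it is reduced to a plain index list for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: group vertex indices that share the same position, so that
// highlighting one "corner" of a cube also covers its duplicated vertices.
// Note: Vector3 keys use exact float equality, which is fine here because
// duplicated vertices in a mesh share bit-identical positions.
static Dictionary<Vector3, List<int>> GroupByPosition(Vector3[] vertices)
{
    var groups = new Dictionary<Vector3, List<int>>();
    for (int i = 0; i < vertices.Length; i++)
    {
        if (!groups.TryGetValue(vertices[i], out List<int> list))
        {
            list = new List<int>();
            groups[vertices[i]] = list;
        }
        list.Add(i);
    }
    return groups;
}
```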

In order to know which vertex in the highlighted mesh corresponds to which vertex in the original mesh, another dictionary is used, called indexRemap<int, int>. This dictionary lets one easily translate vertex indices of the mesh to highlight into vertex indices of the highlighted mesh.

Finally, after an index is added or removed, a line mesh is created from the indices already saved in the class. The idea is pretty simple but works pretty well.
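Putting the last two pieces together, here is a sketch of the line-mesh rebuild (simplified and with my own names; I'm assuming an edge list has already been derived from the highlighted vertices and their neighbor tables). It shows the index-remapping idea and the use of MeshTopology.Lines:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: rebuild the highlight mesh as a set of line segments.
// The local remap dictionary plays the role of indexRemap described above:
// it maps original-mesh vertex indices to highlight-mesh vertex indices.
static Mesh BuildLineMesh(Vector3[] sourceVerts, List<(int a, int b)> edges)
{
    var remap = new Dictionary<int, int>();
    var verts = new List<Vector3>();
    var indices = new List<int>();

    int Map(int original)
    {
        if (!remap.TryGetValue(original, out int mapped))
        {
            mapped = verts.Count;
            verts.Add(sourceVerts[original]);
            remap[original] = mapped;
        }
        return mapped;
    }

    foreach (var (a, b) in edges)
    {
        indices.Add(Map(a));
        indices.Add(Map(b));
    }

    var mesh = new Mesh();
    mesh.SetVertices(verts);
    // MeshTopology.Lines draws every consecutive pair of indices as a segment.
    mesh.SetIndices(indices.ToArray(), MeshTopology.Lines, 0);
    return mesh;
}
```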

Future work:

Perhaps this could be further optimized by not creating a mesh every time an index is added or removed, but instead keeping all the vertices of the original mesh around and only adding or removing triangles. Unfortunately, that would consume more memory, so I will have to assess how well it works on huge meshes.

How to know when a SteamVR controller or a VIVE tracker gets connected in Unity

March 5 | 18

For my PhD thesis I need to know, when a device gets connected in SteamVR, whether it is a generic tracker or a typical SteamVR controller. After searching for a while I found a solution that works, but it relies on comparing the render model of the device to a string. I wasn't satisfied with this solution.

After digging around inside the SteamVR scripts for a while, I found that there is an event you can subscribe to that tells you when a device gets connected or disconnected; by then looking up the tracked device class for that index, you can know whether it's a SteamVR controller or a VIVE tracker.

This code snippet prints when a controller or a generic tracker gets connected.
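A minimal sketch of that approach, using the SteamVR Unity plugin's SteamVR_Events helper and OpenVR's device-class lookup (the component name is mine):

```csharp
using UnityEngine;
using Valve.VR;

// Sketch: log when a controller or a generic tracker gets connected.
// DeviceConnected fires with the device index and its connection state;
// GetTrackedDeviceClass tells controllers and trackers apart.
public class DeviceConnectionLogger : MonoBehaviour
{
    void OnEnable()
    {
        SteamVR_Events.DeviceConnected.Listen(OnDeviceConnected);
    }

    void OnDisable()
    {
        SteamVR_Events.DeviceConnected.Remove(OnDeviceConnected);
    }

    private void OnDeviceConnected(int index, bool connected)
    {
        if (!connected || OpenVR.System == null) return;

        ETrackedDeviceClass deviceClass =
            OpenVR.System.GetTrackedDeviceClass((uint)index);

        if (deviceClass == ETrackedDeviceClass.Controller)
            Debug.Log("Controller connected at index " + index);
        else if (deviceClass == ETrackedDeviceClass.GenericTracker)
            Debug.Log("VIVE tracker connected at index " + index);
    }
}
```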

It's a pretty handy script if you want to know when a controller or a tracker gets connected. Hope this helps you out :).

Creating a tool for changing the fork seals on my R1150gsa

November 9 | 17

So I'm experimenting here: this short post, even though it has to do with programming, is more oriented toward making a tool for doing some work on my bike.

Anyways, my fork seals started to leak on my beloved R1150GS Adventure and, as I have to do everything on my own here (there is no BMW motorcycle dealership here in AR), I ordered the fork seals and the dust seals from Tom Cutter at Rubber Chicken Garage (great guy, and he sells OEM BMW parts). But I was still missing a tool for pushing the seals into the forks. Some people mention using PVC pipes, others mention that a 34mm socket would work, but I wanted to do it precisely (so no PVC piping) and I didn't want to spend ~10 USD on a socket just for that.

So I started to think of a device that I could attach a socket to, and that would push on the hard circular surface of the seals.

I ended up designing a mix between a nut and a flat cylinder that just touches specifically the hard surface of the seals in order to push them inside the forks

It was pretty straightforward to do this in OpenSCAD, and here is the final result:

And of course the code to produce this is pretty simple:
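Something along these lines (this is an illustrative reconstruction, not my original file; every dimension here is a placeholder and must be measured against the actual seal and fork before printing):

```openscad
// Hypothetical sketch: a flat ring that bears only on the hard outer rim
// of the seal, with a hex boss on top so a socket can drive it.
ring_od = 41;  // rim bearing on the hard edge of the seal (example value)
ring_id = 33;  // clears the soft inner lip (example value)
ring_h  = 8;
hex_af  = 19;  // across-flats so a socket fits over it (example value)
hex_h   = 12;

union() {
    // flat ring that pushes on the seal
    difference() {
        cylinder(d = ring_od, h = ring_h, $fn = 120);
        cylinder(d = ring_id, h = ring_h, $fn = 120);
    }
    // hex boss on top for the socket
    translate([0, 0, ring_h])
        cylinder(d = hex_af / cos(30), h = hex_h, $fn = 6);
}
```

The `$fn = 6` cylinder with diameter `hex_af / cos(30)` is the standard OpenSCAD trick for getting a hexagon with a given across-flats width.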

The Spider: Using SteamVR dev kit for creating a glasses tracker.

August 30 | 17

In the lab, I have been working for some time with a SteamVR tracking dev kit we got. I started a small project to create trackable shutter glasses that we can use in our VR experiments. I have been trying to write about this for a while, and I finally found some time to do it!

In this post I'm going to write about how we created, in the lab, a trackable prototype frame that uses SteamVR tracking and attaches to the shutter glasses we use for our projects here at the EAC. If you are interested in more details on the whole workflow, just write me an email and I will try to explain in more detail.

Steam VR Overview

The SteamVR tracking system is based on timing. The wireless connection uses a proprietary protocol, and the only difference between a controller and an HMD is that the HMD has a display; besides that, they work exactly the same way.

The system uses two base stations to track objects. These base stations are called “lighthouses” and need to be synchronized either by wire or by optical flashes (if the lighthouses are in each other's field of view).

These lighthouses contain two motors that spin at 60 Hz (one horizontal and one vertical). They produce IR signals modulated at 1.8 MHz, with timestamps generated at 48 MHz. The time difference between the sync flash from a lighthouse and a laser hit on a tracked-object sensor yields an angle; these angles, plus some precise timing, produce a system of equations that gives the position and rotation of the tracked object.
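As a back-of-the-envelope illustration of the timing-to-angle step (using only the numbers above, not the actual firmware math): the rotor completes one revolution every 1/60 s while ticks are counted at 48 MHz, so one revolution spans 48,000,000 / 60 = 800,000 ticks, and a hit's tick offset scales directly into a sweep angle:

```csharp
// Illustrative math only: convert the tick count between the sync pulse
// and a laser hit into a sweep angle, using the 60 Hz rotor and 48 MHz
// clock mentioned above.
static class LighthouseMath
{
    const double ClockHz = 48_000_000.0;
    const double RotorHz = 60.0;
    const double TicksPerRev = ClockHz / RotorHz; // 800,000 ticks/revolution

    public static double TicksToDegrees(long ticksSinceSync) =>
        ticksSinceSync / TicksPerRev * 360.0;
}
// e.g. a hit 200,000 ticks after the sync flash is a 90-degree sweep angle
```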

Each tracked object contains a set of optical receivers. These receivers detect the reference signals from the base stations: a photodiode converts the IR light into a signal that the FPGA on the tracked object registers as a hit.

The FPGA then uses clock ticks to calculate the angles between laser hits from the base stations, and combines them with the known position of each sensor to generate the equation system that solves for the position and rotation of the tracked object.


In order to design the spider we had to take several factors into account: the tracked object had to be light, it shouldn't occlude the view from the glasses it was going to be attached to, and the sensor placement had to account for the translation and rotation errors that can arise.

Translation errors: these arise when the tracked object is moved around the tracked area. As the distance from the lighthouse increases, the tangential velocity of the spinning sweep also increases, decreasing the time between sensor hits; the timing error then begins to dominate. To avoid this type of error, the sensors should be placed as far apart as possible.

Rotational errors: these arise when the user rotates the tracked object. Rotating sensors orthogonally to a plane yields significant displacement, while rotating them within the plane yields much less; to avoid this type of error, sensors should be placed out of plane.

To achieve these requirements, we decided to 3D print a structure that “hooks” itself onto a set of shutter glasses while respecting the limitations on sensor placement mentioned above. After three iterations, we came up with a final prototype that complies with all the aforementioned requirements.


After designing the frame with the specific position of each sensor, we ran the generated model through a simulator that SteamVR provides, which assesses how well or poorly the proposed model will track.

This simulator offers two types of views: a 3D view looking out from the lighthouse, and a 2D unwrapped view.

Each view shows the translation error, the rotation error, an initial pose view, and the number of sensors visible from that viewpoint. The views are colored from blue (good tracking) to red (tracking not possible). In our case, we only need to track the front of the glasses (as that is where the user faces the projection screen).

As one can see in the 3D figures, the front of our tracked object shows good results for both rotational and translation errors.

Physical Prototype Results

After 3D printing the different parts and positioning and calibrating the sensors and the IMU (gyroscope) of the tracked object, we gave it a few tests, and so far it works promisingly.

Finally, a small video that shows the spider in action. I can definitely say that I look like a cyborg 🙂

Where’s my daughter?! Our Entry for the Global Game Jam 2016

February 3 | 16

For those of you who don't know, the Global Game Jam is a hackathon where the purpose is to develop a game around a given theme.

This year, my Colombian friends from Indie Level and I came up with the idea for a puzzle game where the main goal is to sneak into a castle, get to the princess's room, hypnotize her, and take her back to the starting point. We were inspired by Monument Valley when creating this game.

Without further ado, here's a video of how the game ended up looking after 48 hours. It was made by two artists and two programmers.

We plan to release the game for mobile devices with 10 levels initially, but there is no release date yet. If you like it, let me know 🙂
