Procedural city visualization and movable agent coverage.

April 18 | 16

[Image: AgentCoverageNiceViz]

For the information visualization class, I wanted to revive an old project of mine called the “ambulances” program: back in my college years, for the Data Structures and Algorithms class, we had to come up with an algorithm to visualize ambulance coverage in a city.

The whole idea of the class project was to visualize information, and the idea of the ambulances project was this: given a set of N ambulances, where do you place them in a city graph in order to minimize the response time for each movable agent (ambulance)?
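The exact algorithm we used is in the report linked below, but to give a flavor of the problem, here is a minimal greedy, k-median-style sketch. It is sketched in C# for consistency with the rest of this blog's code (the actual project was C++/Unreal), and everything in it is hypothetical illustration: `dist` is assumed to hold precomputed all-pairs shortest-path travel times over the city graph.

```csharp
using System;
using System.Collections.Generic;

static class AgentPlacement
{
    // Greedy sketch: repeatedly add the node that most reduces the total
    // response time, where each node is served by its nearest chosen agent.
    // dist[i, j] = precomputed shortest-path travel time between nodes i and j.
    public static List<int> PlaceAgents(float[,] dist, int nodeCount, int agents)
    {
        var chosen = new List<int>();
        for (int k = 0; k < agents; k++)
        {
            int best = -1;
            float bestCost = float.MaxValue;
            for (int candidate = 0; candidate < nodeCount; candidate++)
            {
                if (chosen.Contains(candidate)) continue;
                float cost = 0f;
                for (int node = 0; node < nodeCount; node++)
                {
                    // Response time of this node = distance to its nearest agent.
                    float nearest = dist[candidate, node];
                    foreach (int c in chosen)
                        nearest = Math.Min(nearest, dist[c, node]);
                    cost += nearest;
                }
                if (cost < bestCost) { bestCost = cost; best = candidate; }
            }
            chosen.Add(best);
        }
        return chosen;
    }
}
```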

Unfortunately, the data I had didn't have spatial coordinates, so I had to generate the data on my own. I used a modified version of CityGen to dump the data for the generated cities, and then I created an app in Unreal Engine to visualize the generated data and my algorithm.

The idea of this project, besides visualizing information, was also to push myself to learn some Unreal Engine and shake the rust off my C++ skills :-).

In the end the project was finished, and I could see that the algorithm we created worked reasonably well for circular cities, but not as well for elongated ones.

If you are interested in how this project works, feel free to fork it from here. Also, if you are interested in what specifically I tried and what experiments I ran, feel free to read the report I wrote (full of images); otherwise, just look at some of the images below.

[Image: AgentCoverageGreenToRed]

Coverage per agent on a response-time visualization. For this agent placement, if you lived in this city you might want to avoid the places with red nodes in case you get sick often, as ambulances will take the longest to reach them hehe ;)

[Image: CityBaseViz]

A basic procedural city loaded into Unreal.

[Image: ElongatedCityProblems]

The algorithm failing on an elongated city; notice how two agents end up close to each other (not ideal).

[Image: MainPage]

17 agents with their respective clusters being covered.

[Image: NodeCoverage3Agents]

Three agents visualized with their respective clusters. You can see that the brown cluster is bigger, which is a sign the algorithm is not working as expected.
[Image: SpeedVisualization]

Finally, some segment speed visualization: the greener the segment, the faster the speed; the yellower, the slower.

Introducing Potel: a pottery-making simulator for VR.

January 11 | 16

Potel is a small VR application that focuses on exploring the art of pottery through virtual reality. Inside Potel, users can create virtual pots with the aid of a spinning wheel, shaping virtual “clay” the same way you would if you were making pottery in real life.

[Image: Screenshot3]

Players interact with the world with their hands, modeling the pottery to their will by either pushing or pulling an initial blob of clay.

We use a Leap Motion to track the user's hand movements in the world, an Oculus Rift to immerse them in the generated scene, and optional in-house hand vibrators that let players know when they are interacting with the world.

[Image: Screenshot1]

Finally, after the desired shape has been created, the user can 3D print the resulting art.

[Image: UP3Ck6]

If you happen to have an Oculus Rift and a Leap Motion, feel free to download it from here.

We have noticed that a big portion of the users who tried the system without haptic feedback (but knowing about the depth cues) tried to place their hands just above the surface of the clay during their first experiences.

Some other users, once they were quite immersed in the environment after the first attempts, tried to reach for a bucket of water sitting next to the chair as a prop, thinking that everything in the world is interactable.

The system currently supports clay deformation per finger, but we decided to stick to the palms because the Leap tracker sometimes loses the hands and the clay model gets distorted by tracking glitches.
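The paper linked below has the actual implementation details; purely as a hypothetical sketch of palm-driven deformation in Unity-style C#, assuming the clay is a lathed mesh centered on its local Y axis (every name and value below is a placeholder, and only the pushing case is shown):

```csharp
using UnityEngine;

public class ClayDeformer : MonoBehaviour
{
    public Transform palm;            // palm position from the hand tracker (placeholder)
    public float bandHeight = 0.05f;  // vertical band around the palm that deforms
    public float minRadius = 0.02f;   // keep the pot from collapsing entirely

    Mesh mesh;
    Vector3[] vertices;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        vertices = mesh.vertices;
    }

    void Update()
    {
        // Work in the clay's local space, where the lathe axis is +Y.
        Vector3 localPalm = transform.InverseTransformPoint(palm.position);
        float palmRadius = new Vector2(localPalm.x, localPalm.z).magnitude;

        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 v = vertices[i];
            // Because the wheel spins, the palm affects the whole ring of
            // vertices at its height, not just the side it touches.
            if (Mathf.Abs(v.y - localPalm.y) > bandHeight) continue;

            float r = new Vector2(v.x, v.z).magnitude;
            if (palmRadius < r && r > 0f) // pushing: palm is inside the surface
            {
                float scale = Mathf.Max(palmRadius, minRadius) / r;
                vertices[i] = new Vector3(v.x * scale, v.y, v.z * scale);
            }
        }
        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```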

Here’s a small video of the app working:

And in case you are interested in a more in-depth explanation of how the system works, you can read the paper we wrote here.

If you like it, let me know :).

How to know if a point is inside a polygon

April 16 | 15

Here's a quick code snippet that tells you whether a given point lies inside a polygon described by an array of vertices. It's based on the crossing number algorithm.
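A minimal sketch of the crossing-number test looks like this (written against Unity's Vector2 for convenience; the class and method names are just placeholders):

```csharp
using UnityEngine;

public static class PolygonTest
{
    // Crossing number (even-odd) test: cast a ray from the point along +X
    // and count how many polygon edges it crosses; an odd count means "inside".
    public static bool IsPointInPolygon(Vector2 point, Vector2[] polygon)
    {
        bool inside = false;
        for (int i = 0, j = polygon.Length - 1; i < polygon.Length; j = i++)
        {
            Vector2 a = polygon[i];
            Vector2 b = polygon[j];
            // Edge (a, b) counts if it straddles the point's Y and the
            // crossing happens to the right of the point.
            if ((a.y > point.y) != (b.y > point.y) &&
                point.x < (b.x - a.x) * (point.y - a.y) / (b.y - a.y) + a.x)
            {
                inside = !inside;
            }
        }
        return inside;
    }
}
```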

This can also be extended to 3D space if necessary, but I'll leave that to the reader.

Mixing GUILayout and GUI and making them coexist (useful when editor scripting)

January 20 | 15

Warning:

Even though Unity 4.6 came out with the new Unity GUI, you sometimes still need to use the legacy GUI and GUILayout APIs, especially if you are working on editor scripting.

That being said, DON'T use GUI/GUILayout functions for the GUI in your projects (except editor scripting) if you are using Unity 4.6+; using GUI/GUILayout in your actual game is slow and consumes lots of draw calls!

So now that everything is clear, here's a little bit of what happened:

I am programming a new tool for the Asset Store and needed to display some textures in the inspector view (after all, the GUI was written using GUILayout…). Unfortunately, “EditorGUI.DrawPreviewTexture()” doesn't have a layouted version :/ and I didn't want to redo the whole thing using plain GUI, so I came up with an interesting idea that I wanted to share on making GUI and GUILayout coexist in the same GUI :).

So I'm going to show how to take GUILayout code, insert GUI code into it, and make them both coexist in the same Repaint().

Mixing GUILayout with GUI

This happens when we have lots of GUILayout calls, e.g. when we are building our window with pure GUILayout calls, and some of the functions we need are available only in GUI.

In my case, I wanted to create a tool that displays the materials' textures of any given object plus some small info:

So the tool is organized and looks like this:

[Image: ProMaterialCombiner]

And it even maintains its proportions! 😀

[Image: Resizeable]

So how do I do it? It's actually pretty simple. Basically, I create an area (with GUILayout), inside that area I create a ScrollView (also with GUILayout), and then inside it we put all our GUI code (relative to the Area we created, not to the window) and fill everything out with GUILayout.Space(px) calls!

Sounds complex? It's not at all, trust me :). Let me show you some really simple code:
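(The snippet below is a minimal reconstruction of the idea rather than the original tool's code; the window name, texture, and sizes are placeholders.)

```csharp
using UnityEngine;
using UnityEditor;

public class MixedGUIWindow : EditorWindow
{
    [MenuItem("Window/Mixed GUI Example")]
    static void Open() { GetWindow<MixedGUIWindow>(); }

    Vector2 scroll;

    void OnGUI()
    {
        // Everything lives inside a GUILayout Area + ScrollView, so plain GUI
        // coordinates below are relative to the Area, not the window.
        GUILayout.BeginArea(new Rect(0, 0, position.width, position.height));
        scroll = GUILayout.BeginScrollView(scroll);

        GUILayout.Label("Some GUILayout content above the preview");

        // Reserve room in the layout for the GUI-only call with a Space...
        const float previewSize = 128f;
        GUILayout.Space(previewSize);

        // ...and draw into the reserved rect with plain GUI/EditorGUI.
        // GUILayoutUtility.GetLastRect() is only valid during Repaint.
        if (Event.current.type == EventType.Repaint)
        {
            Rect reserved = GUILayoutUtility.GetLastRect();
            EditorGUI.DrawPreviewTexture(
                new Rect(reserved.x, reserved.y, previewSize, previewSize),
                Texture2D.whiteTexture);
        }

        GUILayout.Label("GUILayout continues after the reserved space");

        GUILayout.EndScrollView();
        GUILayout.EndArea();
    }
}
```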

Basically, what we need to do is create an empty area from the GUILayout space and fill it with “spaces” that are consistent with what we draw in the GUI space. Let me draw it for you:

[Image: CombinedGUILayoutAndGUI]

Mixing GUI with GUILayout.

Mixing GUI with GUILayout is basically the same process; you just have to take into account the space consumed by the GUILayout calls and then offset the other GUI objects so they can coexist.

I'll leave this part to the reader, though.

Like this post? Feel free to subscribe to my posts; I'll keep you posted on anything I write :).

Any comments/suggestions/feedback? Let me know :).

Reducing Draw Calls (also named SetPass calls) in Unity 5.

January 10 | 15

So I have been working on Standard shader support for Unity 5 and running some tests on how feasible it is to reduce draw calls with the new Standard shader, and the results were actually pretty positive!

Let's talk about the Standard shader a little bit. The Standard shader is a physically based shader that is consistent across diverse lighting conditions; it's multi-platform, and its texture slots enable different shader features.

Here's a small introductory video of what can be achieved with this single new shader.

The Standard shader “mutates” and becomes faster or slower depending on the textures you use in its texture slots. That being said, these two shaders are different (even if they both use the Standard shader):

[Image: StandardShaders]

So these two (even though both are “Standard”) are totally different shaders when rendering a scene; hence, they have to be handled totally differently if we want to reduce some draw calls (now called SetPass calls).

Before talking about how to reduce draw calls in Unity 5, I will cover how draw calls are reduced regardless of the shader being used, and then how to reduce draw calls with the Standard shader in Unity 5.

Reducing Draw Calls in Unity: The Manual Way.

So, if we want to reduce draw calls in Unity or in any other engine, the way to do it is to use as few materials as possible.

Basically, it all boils down to these steps:

  1. Sort all the materials and gather them by the type of shader they are using.
  2. With the materials that share the same shader, gather their textures into a bigger texture (atlas texture).
  3. Create one material that will contain the shader and the atlas texture(s) created in step 2.
  4. Remap all the UVs of the meshes that use the shader to fit the coordinates of the atlas texture.
  5. Assign the material we created in step 3 to your remapped meshes.

 1. Sort all the materials and gather them by the type of shader.

For ease of explanation, I'm going to use one simple scene where all the meshes use the same type of material but different textures.

In our case we are going to use this scene (which contains 4 materials, each with a Bumped Diffuse shader):

[Image: 0SampleScene4Materials]

Notice the SetPass calls: 4

So for this scene we have these materials to process:

[Image: SceneMaterials]

2. With the materials that share the same shader, gather their textures into a bigger texture (atlas texture).

In our case we have 2 textures per material, as the Bumped Diffuse shader uses a base texture and a normal map, so we need to end up with 2 atlas textures like this:

[Image: AtlassedTextures]
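You can build these atlases in any image editor, or let Unity pack them for you. A minimal sketch using Texture2D.PackTextures (the class name, atlas size, and padding values here are just placeholders):

```csharp
using UnityEngine;

public static class AtlasBuilder
{
    // Pack the gathered textures (e.g. all base textures, or all normal maps)
    // into a single atlas. PackTextures returns where each input ended up,
    // in normalized UV coordinates; we'll need those rects in step 4.
    public static Texture2D Pack(Texture2D[] textures, out Rect[] placements)
    {
        var atlas = new Texture2D(2048, 2048);
        placements = atlas.PackTextures(textures, 2, 2048);
        return atlas;
    }
}
```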

Save these textures in the project view; we are going to need them later on:

[Image: ProjectView]

3. Create one material that will contain the shader and the atlas texture(s).

Create a material and assign (in this case) the two textures to it, using the shader in question (in our case “Bumped Diffuse”):

[Image: 3Material]

4. Remap all the UVs of your meshes that use the shader to fit the coordinates of the atlas texture.

This modification can be done in any 3D editor of your choice (Maya / 3ds Max / Blender / etc.).

This is the most boring/demanding step, as it involves modifying the UV coords of your meshes to make them match the atlas texture's pixels (the texture we created in step 2).

So what we do is take each of our UV coords, located anywhere in [0,0 – 1,1], and re-position them into the smaller subset of [0,0 – 1,1] that matches their pixels in the atlas texture we created in step 2.
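In code terms, the remap is just a scale and offset into the texture's sub-rectangle of the atlas. A minimal sketch (the `atlasRect` placement would come from however you built the atlas, e.g. PackTextures above):

```csharp
using UnityEngine;

public static class UVRemapper
{
    // Squeeze a mesh's UVs from [0,1] x [0,1] into the normalized sub-rect
    // of the atlas that holds this mesh's original texture.
    public static void RemapToAtlas(Mesh mesh, Rect atlasRect)
    {
        Vector2[] uvs = mesh.uv;
        for (int i = 0; i < uvs.Length; i++)
        {
            uvs[i] = new Vector2(
                atlasRect.x + uvs[i].x * atlasRect.width,
                atlasRect.y + uvs[i].y * atlasRect.height);
        }
        mesh.uv = uvs;
    }
}
```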

[Image: 4UVREMAP]

And… we do this for each of our meshes that share the same shader.

 5. Assign to the remapped meshes the material created in step 3.

We are almost there! We still have a couple more things to do:

  1. Select all your UV-remapped meshes, place them where the old meshes were located, and deactivate the old meshes.
  2. Assign the material created in step 3 to all your UV-remapped meshes.
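If you'd rather script the assignment than click through the hierarchy, a hypothetical helper does it (using sharedMaterial so no per-object material instances are created, which would break batching again):

```csharp
using UnityEngine;

public static class MaterialAssigner
{
    // Assign the step-3 atlas material to every remapped renderer.
    public static void Assign(Renderer[] remappedRenderers, Material atlasMaterial)
    {
        foreach (Renderer r in remappedRenderers)
            r.sharedMaterial = atlasMaterial;
    }
}
```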

[Image: 5ObjectsSameMAt]

And with that, we are finished! Now let's hit play:

[Image: 5OptimizedScene]

Before optimizing (see step 1) we had 4 SetPass calls, and after all the optimization we end up with 1 SetPass call, which is amazing! That's a 75% draw call reduction in this case.

Reducing Draw Calls in Unity 5 with the Standard shader.

By now you should know more or less how the optimization process works. As I wrote at the beginning, the Standard shader internally “mutates” and becomes faster or slower depending on the textures you use in its texture slots (and it also renders differently depending on the textures used). That being said, we have to gather the objects' materials that use the Standard shader by the textures used, i.e., gather all the meshes that only have colors, gather all the meshes that only have an Albedo texture, and so on.
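As an illustration, the grouping key could simply record which texture slots a material actually fills; a hypothetical sketch (the slot names are the Standard shader's property names):

```csharp
using System.Text;
using UnityEngine;

public static class StandardShaderGrouping
{
    // Build a grouping key for a Standard-shader material based on which
    // texture slots are filled; materials with equal keys can share an atlas.
    public static string Key(Material m)
    {
        string[] slots = { "_MainTex", "_BumpMap", "_MetallicGlossMap",
                           "_OcclusionMap", "_EmissionMap", "_ParallaxMap" };
        var key = new StringBuilder(m.shader.name);
        foreach (string slot in slots)
            if (m.HasProperty(slot) && m.GetTexture(slot) != null)
                key.Append('|').Append(slot);
        return key.ToString();
    }
}
```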

After gathering our objects by the textures used, we just need to follow the same steps we used for any other shader:

  1. Sort all the materials and gather them by the type of shader they are using (in the Standard shader case, gather them by the textures used).
  2. With the materials that share the same “shader”, gather their textures into a bigger texture (atlas texture).
  3. Create one material that will contain the shader and the atlas texture(s) created in step 2.
  4. Remap all the UVs of the meshes that use the shader to fit the coordinates of the atlas texture.
  5. Assign the material we created in step 3 to your remapped meshes.

Here’s a small scene I created for testing purposes:

[Image: StandardShaderScene]

If you compare the original draw calls (91) with the optimized scene's draw calls (22), you can see there's a ~75% draw call decrease, which is amazing!

Also, try to combine meshes as much as possible; this helps a lot too!

Reducing Draw Calls on Unity Automatically

Knowing that this is a really cumbersome process, I decided to create a tool that automatically gathers your objects, sorts them by the type of shader they use, and automagically “bakes” all your unoptimized scenes! You can find the package here, and if your scene only uses Diffuse and Bumped Diffuse shaders, please be my guest and use it for free!

Like this post? Feel free to subscribe to my posts; I'll keep you posted on anything I write :).

Any comments/suggestions/feedback? Let me know :).