Saturday, February 13, 2016

[Note] Bidirectional Path Tracing

Extending my previous post, [Note] Light Transport Path Integral: we know that the PDF of a sampled path is the joint PDF of its path vertices, and that the PDF of each path vertex is based on its BRDF. However, there is no PDF for the connection edges, or for the final segment of the path that is connected directly to the light or camera purely by a visibility test. With Monte Carlo, if we just do the integration with the PDF from one of the path sampling algorithms, the estimator can have extremely high variance, because no single PDF covers all conditions; each one covers only part of the domain.

Screenshot from 2013 Siggraph course by Jaroslav Krivanek

No matter which path sampling technique is used (path tracing, light tracing, ...), none of them can properly sample every term of the measurement contribution function. Each one either fails to sample camera sensitivity or light emission, or has no way to accurately sample the connection between two sub-paths.

Bidirectional Path Tracing basically combines different sampling techniques: it connects sub-paths and uses "Multiple Importance Sampling (MIS)" to deal with the missing PDF for the connection. As noted above, every sampling algorithm covers only part of the domain, so there can be paths that should contribute a lot to the final image but receive very few samples.

Screenshot from 2013 Siggraph course by Jaroslav Krivanek

As you can see, if we only consider Pa(x) for the sampled path Xa, its contribution is actually quite high, but its probability is very low. With BPT, which applies MIS across multiple sampling techniques, we get a more suitable PDF for the contribution function, which in turn gives us a better rendered image.
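A common MIS weighting scheme is the balance heuristic, which weights each technique by its share of the total pdf at the sampled path. The sketch below is a minimal illustration, assuming we can evaluate each technique's pdf for the same path; the numeric pdf values are made up.

```python
def balance_heuristic(pdfs, i):
    """MIS balance heuristic weight for technique i, given every
    technique's pdf evaluated at the same sampled path."""
    return pdfs[i] / sum(pdfs)

# Two hypothetical techniques that can both generate a given path,
# e.g. p_a from path tracing and p_b from light tracing.
p_a, p_b = 0.02, 1.5   # pdf of the same path under each technique
w_a = balance_heuristic([p_a, p_b], 0)
w_b = balance_heuristic([p_a, p_b], 1)
# The weights sum to 1, so the combined estimator stays unbiased; the
# technique that samples this path more densely gets most of the weight.
```

This is why a path like Xa, poorly sampled by one technique, can still be rendered cleanly: the technique that samples it well dominates the weighted combination.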


Monday, January 5, 2015

[KATANA] Asset API

Concept:
                    asset plugin
Katana <---------------------->  Asset Management System 
                 retrieve, publish
             browser & parameter widget customization


Workflow:
    asset fields (shot#, seq#, ... from node parameters)
        --buildAssetId-->  Asset ID 1  --resolveAsset-->  File Path

If retrieving an asset, just load the asset/scene.
If publishing an asset:
    --createAssetAndPath-->  Generate Asset  -->  Asset ID 2 (with increased version, or overriding the current one)
    --> postCreateAsset(... assetType, fields, args)
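The workflow above can be sketched as a small Python class. Only the method names (buildAssetId, resolveAsset, createAssetAndPath, postCreateAsset) come from the workflow; the Asset ID encoding, the field names, and the bare-class structure are made up for illustration. A real Katana asset plugin registers these methods through Katana's Asset API rather than being a standalone class.

```python
class SimpleAssetPlugin:
    """Hypothetical asset plugin sketch following the retrieve/publish
    workflow: fields -> Asset ID -> file path, plus a publish hook."""

    def buildAssetId(self, fields):
        # asset fields (shot#, seq#, version, ...) -> Asset ID
        return "asset://{seq}/{shot}/v{version}".format(**fields)

    def resolveAsset(self, assetId):
        # Asset ID -> file path on disk (toy mapping)
        return assetId.replace("asset://", "/prod/assets/") + "/scene.katana"

    def createAssetAndPath(self, fields):
        # publishing: bump the version and build a new Asset ID
        fields = dict(fields, version=int(fields["version"]) + 1)
        return self.buildAssetId(fields)

    def postCreateAsset(self, assetId, assetType, fields, args=None):
        # hook to notify the asset management system after a publish
        return {"id": assetId, "type": assetType, "fields": fields}

plugin = SimpleAssetPlugin()
fields = {"seq": "sq010", "shot": "sh020", "version": 3}
asset_id = plugin.buildAssetId(fields)      # retrieve: fields -> Asset ID
path = plugin.resolveAsset(asset_id)        # Asset ID -> file path
new_id = plugin.createAssetAndPath(fields)  # publish: bumped version
```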


Monday, April 21, 2014

Facebook + Unity + Parse


Recently I got some time to do some tests on how Facebook, Unity, and Parse communicate with one another. It worked quite well. Above is a diagram I made to demonstrate the relations among them.

For Facebook, we need to create an app first, which comes with an App ID and Name. For Unity to communicate with Facebook, it has to import the Facebook SDK for Unity package. Once Unity knows what the Facebook App ID and Name are, via the Facebook settings in Unity, the link between them is established.

When a player logs into the app as a Facebook user, the app executes FB.Login. To get user profile info, use FB.API with HttpMethod.GET; the callback function gives us an FBResult in JSON format that can be easily parsed. FB.Feed is for posting on the wall for sharing. FB.AppRequest is for friend invitations. FB.API with HttpMethod.POST can be used for custom activity posts.

To communicate between Parse and Unity, we first need to create a Parse app. On the Unity side, we have to put Parse.Unity.dll into the Plugins folder, then copy the Parse Application ID and .NET Key from the Parse-provided script into Unity. Once the player logs in as a Facebook user, ParseFacebookUtils.LogInAsync helps save user data into the Parse DB. Retrieving data back from Parse is also quite straightforward: just use ParseObject.GetQuery with the given constraints. To change DB data, use ParseObject.SaveAsync.


Sunday, December 15, 2013

[Note] Light Transport Path Integral

From previous post, we've known what rendering equation is:

    L(x, w) = Le(x, w) + Integral( BRDF(x, w, w(x, x')) * L(x', w(x', x)) * G(x, x') * V(x, x') dA' )

    x: surface element receiving incoming lights
    w: reflection direction
    w(x', x): direction from x' to x
    G: geometry factor
    V: visibility
   
This is the calculation based on a point on a surface. When producing final image, it actually involves light paths from the light sources to the camera.

Take a look at the (L) inside the integral: this incoming light for point x is also reflected light from point x', and point x' has its own rendering equation. That is to say, if we repeatedly replace each L inside the integral with its own rendering equation, we get a multiple-integral equation, which is called the "measurement contribution function":

 

Screenshot from 2013 Siggraph course by Jaroslav Krivanek

The Greek letter rho here denotes the BRDF.

In order to solve this multi integral equation, the general approach is using Monte Carlo integration.


 Screenshot from 2013 Siggraph course by Jaroslav Krivanek

The main idea of Monte Carlo is to use sampling to approximate the solution. There are many ways to generate samples so as to reduce variance; here we talk about importance sampling, which is based on a probability density function (PDF). Basically, it generates more samples in regions that contribute more to the result, and fewer samples in regions that influence the result less. Because the densely sampled regions would otherwise be over-counted, each sample must be weighted down in proportion to its density; that is why the estimator uses f(x)/p(x) as the properly weighted sample.
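As a concrete illustration of the f(x)/p(x) weighting, here is a toy one-dimensional example (not from the post): estimating the integral of f(x) = 3x^2 over [0, 1], whose exact value is 1, with uniform sampling versus importance sampling from p(x) = 2x, a pdf roughly proportional to f.

```python
import random, math

def f(x):
    return 3.0 * x * x          # integrand; exact integral over [0, 1] is 1

def estimate_uniform(n):
    # p(x) = 1 on [0, 1]: every sample is weighted equally
    return sum(f(random.random()) for _ in range(n)) / n

def estimate_importance(n):
    # p(x) = 2x on [0, 1]: roughly proportional to f, so lower variance.
    # Inverse-CDF sampling: u ~ U(0,1)  ->  x = sqrt(u)
    total = 0.0
    for _ in range(n):
        x = math.sqrt(random.random())
        total += f(x) / (2.0 * x)   # weight each sample by f(x)/p(x)
    return total / n

random.seed(0)
print(estimate_uniform(10000), estimate_importance(10000))  # both near 1.0
```

The importance-sampled estimate fluctuates noticeably less around 1.0 for the same sample count, which is exactly the variance reduction the paragraph above describes.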

Now come back to the light paths integral. With Monte Carlo, we need to know how to properly sample the light path, and what is the probability density of that sampled path to have Monte Carlo estimator ready to be evaluated.

About path sampling, there are different algorithms (e.g., path tracing, light tracing, and bidirectional path tracing). Take path tracing for instance: the sampling process is based on what is called "local path sampling", which means sampling one path vertex at a time. The procedure has three main steps:

 Screenshot from 2013 Siggraph course by Jaroslav Krivanek

(1) Sample a path vertex from the camera (if using light tracing, sample from the light instead). For the camera, we usually sample with a uniform distribution over the lens surface; for the light, we sample based on the emitted power distribution.

(2) Shoot a random ray through the image plane to extend the path and get an intersection point.

(3) From this intersection point, either repeatedly sample a direction from the BRDF and shoot another ray until reaching the light source, or test visibility with the light source and connect them together to form a complete light path.
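The three steps above can be sketched in code. ToyScene is a stand-in with made-up methods so the sketch actually runs; it is not a real intersection or BRDF routine, just enough scaffolding to show the control flow of local path sampling.

```python
import random

class ToyScene:
    """Stand-in scene: every ray hits something, and a hit counts as a
    light 20% of the time. Purely illustrative, not real geometry."""
    def sample_camera_vertex(self):       return ("camera", random.random())
    def sample_ray_through_pixel(self, x): return random.random()
    def intersect(self, ray):             return ("surface", random.random())
    def is_light(self, hit):              return hit[1] < 0.2
    def sample_brdf_direction(self, hit): return random.random()
    def sample_light_vertex(self):        return ("light", random.random())
    def visible(self, a, b):              return True

def sample_camera_path(scene, max_depth=5):
    x = scene.sample_camera_vertex()            # (1) sample a vertex on the lens
    path = [x]
    ray = scene.sample_ray_through_pixel(x)     # (2) random ray through the image plane
    for _ in range(max_depth):
        hit = scene.intersect(ray)
        path.append(hit)
        if scene.is_light(hit):
            return path                         # (3a) reached a light while extending
        ray = scene.sample_brdf_direction(hit)  # (3a) extend by BRDF sampling
    light = scene.sample_light_vertex()         # (3b) or connect to a light sample
    if scene.visible(path[-1], light):
        return path + [light]
    return None                                 # connection blocked: no complete path

random.seed(1)
path = sample_camera_path(ToyScene())
```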

Since local path sampling importance-samples the contribution function associated with each path vertex, paths with higher contribution have a higher probability of being sampled.

About the probability density of the sampled path, it's quite straightforward:
    p(x) = p(x0, x1, x2, x3) = p(x3) * p(x2|x3) * p(x1|x2) * p(x0)
    x stands for the whole path
    x3 is the camera vertex and x0 is the light vertex (assume both are sampled from uniform distributions); x1 and x2 are the other path vertices along the way

Simply put, it's the product of the conditional PDFs of the path vertices.
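Written out numerically, the factorization is just a product; the pdf values below are made up for illustration, with the camera and light vertices sampled uniformly over their areas and the middle vertices sampled via the BRDF.

```python
# Joint pdf of the sampled path x = (x0, x1, x2, x3): the camera vertex
# x3 and the light vertex x0 are sampled independently, while x2 and x1
# are each sampled conditionally on the previous vertex via the BRDF.
p_x3 = 1.0 / 4.0        # uniform over a lens of area 4
p_x2_given_x3 = 0.3     # BRDF-based directional pdf (converted to area measure)
p_x1_given_x2 = 0.25
p_x0 = 1.0 / 2.0        # uniform over a light of area 2

p_path = p_x3 * p_x2_given_x3 * p_x1_given_x2 * p_x0
```

This p_path is exactly the denominator the Monte Carlo estimator divides by when weighting the measurement contribution of the sampled path.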


Friday, November 22, 2013

[Unity] Mobile Test Setup


After 2 years without using Unity, got to grab some memory back :)

Android
1. Install the Android SDK
2. Use the Android SDK Manager to install the platform files & USB driver (platform 19 seems not to work; I used 17 instead)
3. Enable "Unknown Sources" on the Android device to allow installing non-Google-Play apps
4. Install any file manager on the device (I used the one from Rhythm Software)
5. Build the .apk in Unity and copy it onto the device
6. Use the file manager to navigate to where the .apk is. Install. Done!

iOS
On the Unity side, just go to the Player Settings and make sure the bundle ID matches the one in the provisioning profile; the target iOS version should be set by default to the oldest.

http://www.brianjcoleman.com/tutorial-provision-your-app-to-run-on-a-device/

I found this tutorial written by Brian Coleman quite helpful. The only things it doesn't mention are that you might need to set the deployment target in Xcode and mark the device "use for development".