Friday, September 22, 2017

TIPS / A few ways to hold your iPad

I have been using the iPad since the original model. Back then the iPad was bulkier, and I held it with both hands most of the time. I remember the Grabbit iPad Case from those days, which let me hold the iPad single-handed by slipping my hand into a kind of handle.

These days, the iPad is very slim and light. Even the large iPad Pro can be held with a single hand, but be careful not to drop it on your head or teeth while using it lying down (!).

WAYS TO HOLD YOUR IPAD


These are the seven stances I use most often when holding the iPad with a single hand:

1. STRONG PINKY

2. UPSIDE DOWN SEVEN, good for lying down, but do not drop it on your head (!)

3. CONSERVATIVE 

4. OCD

5. THUMB and PINKY

6. SIDEWAYS SEVEN, a more stable landscape grip

7. SIDEWAYS CONSERVATIVE

EXTRA TIPS:


  • If you are placing the iPad on a table without its case, I found the GorillaPod to be the best holder for it.
  • You can certainly place it on your lap or on a flat floor.
  • You can also hold it with your feet, though that is not the most comfortable technique; a bit advanced.

Monday, August 21, 2017

SWIFT / Procedural SceneKit Test for ARKit

I have been studying ARKit in iOS 11 for almost two months now, since it was announced at WWDC 2017. It has been quite an interesting experience. I am learning a lot about iOS app development and Swift, especially in areas like using SceneKit with ARKit.

The Swift language itself is fun and expressive to use. I have found a lot of interesting things Swift can do inside Xcode to make interactive apps, and also inside Swift Playgrounds on the iPad.

SCENEKIT WITH SWIFT

So far, with ARKit, I have managed to bring some 3D assets and animation from Blender into Xcode and view them in AR. That is pretty fun in itself, but I wanted to do more, perhaps some kind of procedural 3D scene with basic interactions.
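
Loading such an exported asset only takes a few lines. Here is a minimal sketch, assuming a Collada file exported from Blender and added to the project under the hypothetical name art.scnassets/myModel.dae, and an existing SCNScene named scene:

// Merge the nodes of an exported Blender scene into the current scene
// (the file name here is hypothetical)
if let modelScene = SCNScene(named: "art.scnassets/myModel.dae") {
    for child in modelScene.rootNode.childNodes {
        scene.rootNode.addChildNode(child)
    }
}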

I am thinking of collecting, in this blog, some Swift code that can help generate 3D SceneKit assets on the fly.

Apart from Xcode, I am also using the open source Blender 3D and node add-ons like Sverchok and Animation Nodes to help me with algorithms and logic, which I can convert to Swift bit by bit to generate a cool procedural SceneKit setup.

I could imagine that before 2020 the iPhone and iPad will have some kind of node-based tools and system to play with 3D and maybe AR.

Obviously, for basic 3D scene layout the SceneKit editor inside Xcode is sufficient, but sometimes coding the scene and procedurally laying out 3D objects comes in handy after all.

I think the AR world is very powerful when treated like a game environment. Familiarity with SceneKit and SpriteKit is a big bonus if you want to transfer that knowledge to the ARKit environment.

SWIFT: PROCEDURAL GRID OF SPHERES

This is a basic, classic task; many coding examples start this way. To generate a bunch of 3D objects in an XYZ grid, we simply use nested for loops, iterating over the ranges of numbers we specify.



I am testing it out like this:

CODE:
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Set the view's delegate
        sceneView.delegate = self
        
        // Hide the statistics overlay (fps and timing information)
        sceneView.showsStatistics = false
        
        // Load the scene (the ARKit Xcode template ships with art.scnassets/ship.scn)
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        
        makeSphere(scene: scene, rangeX: 10, rangeY: 20)

        // Set the scene to the view
        sceneView.scene = scene
    }



The function to generate a grid of spheres looks something like this:

func makeSphere(scene: SCNScene, rangeX: Int, rangeY: Int) {
    // Closed ranges are inclusive, so this creates (rangeX + 1) * (rangeY + 1) spheres
    for i in 0...rangeX {
        for j in 0...rangeY {

            let sphere = SCNSphere(radius: 0.03)
            let sphereNode = SCNNode(geometry: sphere)

            // Space the spheres 0.1 m apart on the XY plane
            sphereNode.position = SCNVector3(Double(i) * 0.1, Double(j) * 0.1, 0)

            // Parent the sphere node into the rootNode of the scene
            scene.rootNode.addChildNode(sphereNode)
        }
    }
}

The code above is quick and dirty; it really does everything in one go when viewDidLoad() is called. Perhaps we can structure the app more nicely and not call everything at the beginning. I will think about that later.

Also, when dealing with many objects, it would be nice to keep track of the generated nodes and put them inside an array or dictionary of some sort, as sketched below.
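
A minimal sketch of that idea, assuming a dictionary property on the view controller (the name sphereNodes is hypothetical), keyed by grid coordinates:

var sphereNodes = [String: SCNNode]()

func makeTrackedSphere(scene: SCNScene, i: Int, j: Int) {
    let sphereNode = SCNNode(geometry: SCNSphere(radius: 0.03))
    sphereNode.position = SCNVector3(Double(i) * 0.1, Double(j) * 0.1, 0)
    scene.rootNode.addChildNode(sphereNode)

    // Keep a handle on every generated node so it can be
    // moved, animated, or removed later
    sphereNodes["\(i)_\(j)"] = sphereNode
}

With that in place, removing or repositioning a sphere later is just a dictionary lookup, for example sphereNodes["2_3"]?.removeFromParentNode().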

For now, I really want to get the hang of procedurally generating a grid of spheres.



INTERACTION IN APP

What is interesting about an ARKit app is that we can let the USER interact with and experience the 3D world, and also give them the ability to spawn and generate 3D objects, as far as we design controls for them to do so while the app is running.
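
As a rough example of what I mean (a sketch, not necessarily how a finished app should do it), a tap gesture can spawn a sphere into the scene. This assumes the sceneView property from the ARKit template; handleTap is a hypothetical helper:

// In viewDidLoad():
let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
sceneView.addGestureRecognizer(tap)

// Elsewhere in the view controller:
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let sphereNode = SCNNode(geometry: SCNSphere(radius: 0.03))
    // Spawn the sphere half a metre in front of the world origin
    sphereNode.position = SCNVector3(0, 0, -0.5)
    sceneView.scene.rootNode.addChildNode(sphereNode)
}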

For basic-level SceneKit creations using Swift, we should easily be able to do the following (a sketch follows the list):
- Generate the primitive 3D objects provided by Apple's Xcode
- Assign and adjust materials
- Place the 3D objects into the main scene (SCN)
- Add a camera
- Add lights
- Add particles
- Etc.
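
A minimal sketch covering a few of the items above, assuming a scene constant already exists (the geometry and positions are arbitrary):

let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.01)
box.firstMaterial?.diffuse.contents = UIColor.red   // assign / adjust the material

let boxNode = SCNNode(geometry: box)
scene.rootNode.addChildNode(boxNode)                // place the object into the scene

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()                     // add a camera
cameraNode.position = SCNVector3(0, 0, 1)
scene.rootNode.addChildNode(cameraNode)

let lightNode = SCNNode()
lightNode.light = SCNLight()                        // add a light
lightNode.light?.type = .omni
lightNode.position = SCNVector3(0, 1, 1)
scene.rootNode.addChildNode(lightNode)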

These are common tasks when working in 3D software like Blender, Maya, or Houdini, but here it feels a bit more like game production, giving more control to the user when they actually experience the app. I find this quite interesting.

I suppose I shall continue collecting knowledge around this area for AR experiences.

Wednesday, July 26, 2017

JOURNEY / My First Sticker Pack app

I decided to write a little "Behind The Screen" story of my first iMessage Sticker Pack app made for iPhone and iPad, currently available on the Apple App Store. Hopefully this will help inspire other artists, illustrators, animators, and creators thinking of making their own app.

Woofy Stickers

https://itunes.apple.com/au/app/woofy-stickers/id1234473192?mt=8


Just Like Writing A Book

Creating a collection of stickers is like writing a book. It is not actually a linear process.

You start by collecting a lot of photo references; in my case, I collect a lot of dog photos. The next step is of course to draw a lot of doodles. Out of 1000 drawings over a long period of time, maybe you get roughly 10 good-enough stickers; that is my quick estimate. Then again, we need to keep a consistent style and combine some ideas for the final product.

Especially with iMessage stickers, we need to think about what kind of emotion or expression a person would want to share with a sticker. I tested this with some friends and my wife as well.

Draw A Lot: Digital and Analog Ways Combined

I used to bring my A5 drawing pads with me and draw using pen or pencil on the train or bus. These days, I draw using my iPad and sometimes my iPhone. I no longer have to buy a new drawing pad each time or take photos of each little drawing. The process is a lot more streamlined and smoother nowadays, with the iPad having a built-in camera and Internet capability.



On the iPad, I use the Paper 53 drawing app a lot. It is one of my favourite doodling tools, apart from Tayasui Sketches and Procreate. But Paper 53 is the best for me because I like to quickly put my ideas and thoughts down as drawings, and Paper 53 is really fast and provides a set of very natural drawing tools.

I am currently using an iPad Air and the Pencil 53 stylus, as you can see above, but finger drawing is still pretty handy in most cases. Sometimes I draw on the sofa or even in bed, whenever I get a new drawing idea. I got my iPad Pro quite recently and use the Apple Pencil a lot. I made my sticker pack app using the iPad Pro and the ArtRage app for a paintbrush look.

Whenever I finish drawings, I quickly send them as PDF or PNG to my email as backup. Sometimes I post them to Twitter or Instagram if there is a bit of a fun story to share.

Challenge Yourself

For me, the experience of creating my first app is a big deal, even though it is just a sticker app.

I learned to:

  • Use Xcode
  • Package the stickers
  • Design the app icon
  • Process and prepare the product
  • Publish the app

I always had this question: "I am not a programmer, but I have always wanted to make some kind of app that other people can use. Can I make one?"

Luckily, since WWDC 2016, Apple has made it really easy for artists and illustrators to make their own sticker packs for iMessage!

All we really need to do is download Xcode and use the iMessage Sticker Pack template. The minimum requirements:
- Have a MacBook Pro (2010 or later)
- Have an iPhone
- Have a good Internet connection to get learning resources.

This video from Apple WWDC really helped me make my sticker app:
https://developer.apple.com/videos/play/wwdc2016/204/

You can apparently get a free Apple Developer account simply by using your current Apple ID. There is no need to pay for the Apple Developer license until you decide you want to sell and publish your work on the App Store.

After learning some Xcode and Swift, I also found that you can make many simple apps and install them on your iPhone and iPad very easily: a camera app, a calculator app, a notes app, and many more. Easier than you might have thought. Swift is apparently a very friendly coding language, similar to Python.

What is Next?

My sticker app is probably not the top sticker app out there; in fact, it is far from a successful project. But this is only the beginning.

I am currently learning a bit more about Swift programming and how to make AR (augmented reality) apps using Apple's ARKit (introduced at WWDC 2017). It is really fun! I think I should make my stickers a bit more animated.

Friday, September 16, 2016

IPHONE 7 / First Impression

I was lucky enough to get the iPhone 7 Plus on the first day. I ordered it within 5 minutes of it going on sale online, just in time.

I guess queueing at the Apple Store is no longer prioritized; online reservation is the way to go. A pity, because queueing is such an experience. However, you will still queue for a bit when you visit the store with your reservation details :)

And so I decided to write a "first impression" article. I am actually most interested in the iPhone's dual camera upgrade and the overall lens improvements.

I don't care too much about the color or the design similarity with the iPhone 6, and I haven't thought much about the missing headphone jack or the dual speakers. It is certainly a lot louder, almost like the iPad Pro. It is also water resistant to 1 meter for 30 minutes. Battery and speed? Also improved! Even though I didn't use the Internet much today, I took over 1000 photos and videos and still have 20-30 percent left. A pretty sweet improvement overall. This is probably the most balanced iPhone, very mature in hardware and software.

WAS FASTER, EVEN FASTER!

Both my parents use the iPhone 6S, and my impression from occasionally using their cameras is that the 6S camera was better and sharper, great for 4K video as well. To be really honest, going from 6 to 6S to 7, I just wish the 6's camera had been as good as the 7's is today, especially aperture-wise.

I understand the stabilizer and video were the big push for the iPhone 6; although it was marketed as having a better camera, I think the 6's camera was just okay overall. I remember taking 4K video of hovering bees; it was gorgeous. Oh, and not forgetting LIVE PHOTOS on the 6S: it is still on the iPhone 7, and much faster.

The iPhone 5 through 6 were always great for fast shooting, and I love the burst mode. Love the slo-mo.

THE IPHONE 7

OK, now, switching to the iPhone 7: I actually walked 6-7 hours around Sydney today with the iPhone 7, sometimes also taking photos with the iPhone 6 and Fuji X100 for comparison.

The improvements over the previous model are obvious:
- Daylight shooting is amazingly sharp, almost always looking perfect for scenery. High score for daylight, and the colors look ultra natural. Best enjoyed on a wide-gamut Retina display.


- HDR almost always looks good, with no more ghosting. The HDR feels natural color-wise.




- LIVE PHOTOS has become really fast, with no hiccups during daytime shooting. Google's Motion Stills app loves LIVE PHOTOS, and I think it is worth shooting with LIVE PHOTOS on at all times!


You can check my Instagram @enzyme69 to see some of the Live Photos x Motion Stills remixes.

- It is still only 12 MP, like the iPhone 6S, but feels a lot sharper overall.


- The panorama swipe is even higher in resolution and a lot more stable. Faster in low light too.

Not original Panorama. Too big :D




What's NEW with the iPhone 7, then?
- Night-time shooting is a lot brighter and a lot less noisy, obviously thanks to the bigger lens aperture. Noise is occasionally visible, but sometimes it seems to switch to the other lens and suddenly it looks good.

Darker better.




- We got legit RAW photo saving in the iPhone 7. Currently it needs third-party apps to actually save RAW, which is a pity. Maybe when they update iOS 10 they will build RAW into the stock Camera app. RAW files are big and mean extra work, though; always keep that in mind. I have not explored RAW much, just because it is too dependent on apps (I use Manual and ProCam; I haven't tested the Lightroom camera). Manual controls are sometimes fiddly in apps, and it takes time to learn each app's behaviour. I favor Camera+ from tap tap tap, but it does not have RAW support yet.


- Of course, not forgetting the DUAL CAMERA system. The 1-2x optical zoom from wide to telephoto is seamless. I don't know exactly how the system works, but the two lenses work together and the lens switch is not detectable; it smoothly zooms in and out. We can push to 10x digital zoom, and sometimes it looks perfectly fine for daytime shooting. Imagine roughly "original iPhone" quality when the image is zoomed in (kind of low-resolution looking), but still better. Actually, when shooting video, the zoomed resolution is higher; you can digitally zoom and it still looks great.

Camera Zoom 1x (optical zoom)
Camera Zoom 2x (optical zoom)

Camera Zoom 10x (digital zoom): still looking great, actually!

- The overall texture and color are brilliant. Maybe there is more to those 12 MP? I remember my old compact cameras did not even reach this level of crispness.

iPhone 7 texture detail is great.

Fuji X100 snippet. Same shot as above.



- Portrait mode and the bokeh algorithm are not available yet (!); we need to wait for an iOS 10 update in a few months' time. They are still working on it. I have a feeling this iPhone 7 has some hidden tricks. So I cannot say much about this "bokeh" thing for portraits; I have no models yet.

A bit of bokeh, but maybe the lighting conditions and distance matter. I want more BOKEH! More intense BOKEH!

A few more comparisons: iPhone 7 Plus vs Fuji X100 vs iPhone 6 Plus

All were taken using the default automatic mode, although I think each device could take a better shot with proper settings or apps (longer shutter, etc.). The lenses and sensors also differ slightly.

iPhone 7 zoom x1

iPhone 7 zoom x2

Fuji X100

For daytime shooting, the iPhone 7 and Fuji X100 are actually a match, or is the iPhone 7 better? The Fuji X100 is a lot older, though. For night time, however, the Fuji can take a nice shot where the iPhone's stock Camera app cannot. HOWEVER, with apps (slow shutter control) and proper RAW editing, the iPhone 7 actually can.

This is the iPhone 7; I probably took a bad shot. It could be better, but this is just point and shoot.

iPhone 6 Plus.

Comparing the iPhone 6 Plus and 7 Plus, the night-time results are different enough, although sometimes the iPhone 7 can still take bad photos :) Hard to explain. Maybe the exposure or the noise reduction is badly reducing the quality. Or the auto ISO? Maybe I will have to test some apps with the iPhone 7?

HOW COULD THE IPHONE 7 BE BETTER?

- The sensor could still improve until it matches a camera like the Fuji X100 (maybe getting closer with each iPhone iteration). Yes, I really wish the iPhone camera sensor would one day reach that level; it will within a few years. On the other hand, there are many features available via apps that multiply the iPhone's overall photography capability.



- Control of ISO, aperture, etc. is still not in the stock Camera app. It needs a simple control that works; maybe the iPhone needs a simpler method to adjust the exposure triangle. To be honest, with the iPhone, unless it can show a realtime result preview like the Lightspeed app or Camera+'s super ultra-low ISO mode, we don't really have time to manually control those things. Personally, I think the smartphone camera is the best modern street camera for many reasons. I am not comparing it with a Leica or Fuji camera; the iPhone stands on its own. It's different: instant, with a lot more.

Shooting upward is sometimes awkward. Noise is pretty bad here, because of the ISO.

- The ergonomics of the iPhone. Now this is a tricky one. The design of the iPhone is super simple; there is almost no way to hold it properly, and it is slippery even, though of course we eventually find a decent grip. It still feels fragile. A heavier case with a handle can help, but that's an additional thing; I am using the expensive Apple "leather" case, which is still a bit slippery. Anyhow, if you use the iPhone as a photography camera for a while, you should know the shutter is sometimes best triggered from the volume rocker (not the screen), under your left thumb. Sometimes you want to do that. Another way is to use Peek and Shoot single-handedly.



- There is one weird thing with the iPhone 7: the screen is so fast and responsive that it is easy to take accidental photos. You will understand what I mean if you try the iPhone 7 yourself: those extra blurry shots right after taking photos, maybe 1% of the time.


IPHONE 7 AND BEYOND

There you go, my first overall impression of the iPhone 7. So far so good: the iPhone I always wanted. From 4 to 5 to 5S (great) to 6 Plus and now 7 Plus, it has been a great evolution unlike any other; every iteration and cycle brings improvements. I wonder what the next iPhone is going to be like? Maybe they will focus more on "deep learning" and "deep 3D" for augmented reality. But what is "photography" on the iPhone going to become? Lytro-like? We will see!

Monday, July 18, 2016

Exploring Deep Neural Style with Pikazo app

Every now and then, an iOS app comes along that is really unique and way ahead of its time. I wanted to write in depth about the PIKAZO app by Pikazo Inc.

Pikazo 2 Logo

This app allows users to mix and remix an ART STYLE with a SUBJECT of their own (photographs, sketches, etc.) and generate graphic output that is worth studying further, or perhaps a new original art piece, especially when printed on canvas or other products.

The resulting image looks as if the original artist had painted and recreated your photo.

You might have heard of the recently popular Prisma app, which supposedly turns photos into stylized graphics. This PIKAZO app does a LOT more, and is more true to the nature of style transfer.

Keep in mind this is not the usual sketch filter we have had for years via Photoshop. With neural style transfer, the machine actually does the hard job of separating stylistic elements and then reassembling them onto the original photo. At least that is how I think of the process.
 
Below are just a few quick examples of my creations using the Pikazo app:

iPhone Selfie (2016)

My Wife (2016)

Pretty impressive, right? I think so! Let's see some more, but first I want to give a bit of background on "neural art transfer" and its roots in "deep dream".

STORY OF DEEP STYLE TRANSFER BEFORE PIKAZO APP

My interest in "art style transfer" actually started a few months ago when I got into an interesting area of exploration called machine learning, neural networks, and Deep Dream (which I stumbled upon by accident while studying and researching Python and the Jupyter/IPython Notebook):

GOOGLE DEEP DREAM DEMO
https://github.com/google/deepdream

Machine learning is a branch of computer science that takes data, analyzes it, and finds patterns automatically. The potential is huge, including voice recognition, image recognition, etc.

Like most curious geeks, I tried to set up my own "Deep Dream" on my home computer. And I did it, with lots of hassle: it took hours and eventually days of researching and installing all sorts of Python repositories, but it eventually worked.

The setup is long, and getting a result also takes a while.

The Deep Dream algorithm itself is a nice graphics demo, showing a machine's ability to redraw a photo using a collection of other photos.

Candi Borobudur by Philip Lesmana photo, redrawn using Temples

Cat on Pot (1998) photo, redrawn using dog photos

The Google Deep Dream demo is fascinating in itself. It was shown about a year ago, in 2015.

Now... from deep dream to "neural style transfer".

The next step after Deep Dream is neural style transfer, which is less bizarre but more aesthetically beautiful.

Parisian Ballerina, photo by Jimmy Gunawan. Made in Pikazo.


The process of making an "art style transfer" is actually slow and painful, especially if the machine is using CPU only: a 512-pixel maximum dimension can take around 30+ hours. Yes, really that slow and painful. The Pikazo app does it in under 5 minutes!

- Deep Dream on GPU can be fast, depending on your machine
- Neural style on CPU is slow; on GPU it is really hard to set up
- Neural style using the Pikazo app and Pikazo cloud computing => 5 minutes for 500 x 500 pixels

I actually found a neural style transfer repository on GitHub that let me create something like the image below:

The photo is from Emrata / Emily Ratajkowski's Instagram, and I used an anime-style graphic as the style to transfer.

I found neural style to be totally amazing, unusual research. In a way, it is a very quick way to see what result we get when we transfer an art style onto a photo.

Maybe this can be an ART IN PROGRESS thing, not a final result? Perhaps it can help people study art styles, and become more appreciative of the art when it is applied to their own SUBJECT / CONTENT? This is what I continue to test using the Pikazo app.


WARNING: ART REMIX AND COPYRIGHT ISSUE (!)

Since I started this "neural style transfer" exploration, I have been completely aware of and really careful with copyright issues, especially when remixing CONTENT and ART STYLE.


Made in Pikazo.

One cannot simply take a photo shot by someone else and an art style painted by an artist, mix the two, and claim the result as one's own.

So I try as much as possible to use my own CONTENT / SUBJECT creations, remixing them with FOUND ART STYLES that I can credit and attribute.


Photography by Willy Gunawan, my brother. Style: Unknown. Made in Pikazo.

There are limitless ART STYLES to explore out there, from Kandinsky to recent modern artists. I definitely appreciate ART STYLE more nowadays, after Pikazo.

For my own daily art project, I am using old and new photos of mine, recycling years of photos collected on Flickr, and now using Pikazo to bring them back to life.

If you are feeling adventurous, you do not necessarily need to mix a photo with an art style; you could easily feed your own painting styles, hand drawings, or perhaps textures into the calculation. You will get some interesting results!

Japanese Girl 05, Made in Pikazo.

Japanese Girl 14, Made in Pikazo.


DISCOVERING PIKAZO and PIKAZO 2 




The Pikazo app was originally created by Karl Stiefvater, just a few months ago. I found Pikazo probably by lucky accident. It is a simple app that transfers STYLE onto CONTENT, mixes the two, and produces a 500 x 500 pixel piece of art. It actually does EVERYTHING I had tried to do when exploring all kinds of methods for neural style transfer on my own computer, and a lot more simply too, thanks to the smartphone's ability to do it on the fly (I use an iPhone).

There is of course a sense of satisfaction in having been able to make the neural style setup on my own machine, before later finding out that smarter people out there were doing the same and built a tool that does more. I really applaud Pikazo Inc. for this!

So, as the next step in my exploration of neural art style transfer, I tried using an old photo of mine of a black dog at Crown Street.

I actually tested it with hundreds of styles using the Pikazo app. It gives me 500 x 500 pixel output.

"Black Dog" (2016) Made in Pikazo.
Before long, Pikazo 2 app actually came to surface. Pikazo 2 is a more refined version of the app, allowing users to do more like reading MUSE area for article and inspiration of style. And also users can join the PIKAZO SALON Facebook Group, where other Pikazo artists shares their artworks and talks and comments.

Pikazo 2 by default allows you to draft and create 800 x 800 pixel artworks. Don't let the dimensions fool you; 800 pixels is actually quite a high resolution for sharing.

Portrait photo and model: Ricky Adrian / Kie Photography. Made in Pikazo.

If you would like to go further with your own artworks, you can apparently buy Jetons and output up to 3200 x 3200 pixels: large enough to print on canvas or other items like T-shirts, mugs, or whatever design products. Maybe even an iPhone case? Yes, why not :)

PIKAZO TIPS



For beginner Pikazo users, I can give you some tips:
  • Collect all kinds of styles and photos, and be experimental
  • Try using a close-up PORTRAIT or perhaps a SELFIE; it often gives a nice result. As silly as a selfie may seem, with Pikazo even the silliest face looks awesome.
  • Pikazo works best with the most ABSTRACT styles, but try all kinds of ART STYLES with interesting paint-stroke details, and be surprised!
  • Explore all kinds of art styles
  • Use high-resolution photos and art if you can
  • Find more inspiration at the Pikazo Salon


Black Dog XIV. Made in Pikazo.

OTHER STYLE TRANSFER APP AND PLUGIN

Below is a deep style transfer result using the Dwango Python Chainer plugin for OpenToonz. I tested this on my own machine at home, and it took me a few hours to set up and get a result. It is very slow on CPU, perhaps faster on GPU, but not easy to set up.

The Prisma app actually produces this kind of style very similarly, so I guess it probably implements the same quick shortcut technique.



Overall, which neural style result you like comes back to personal taste. The Prisma app is interesting and fast, but it gives an effect more like a Photoshop filter.

I much prefer to use Pikazo for a more REAL and sophisticated result. Art forms should continue to evolve, analog or digital or anything else. Rediscover old and new styles. Remix them. Appreciate them.

ONE MORE THING...

I remixed 400 "black dog" results that I collected on the same subject; below is "The Black Dog" with all of them superimposed over one another. It becomes almost like a real painting:

Result of mixing 400 styles. Almost like the original photo, but it feels like real art.

What is next? Probably neural art style applied to movies or animation? We shall see!

"The Gap" (2016), Sydney, Made in Pikazo.


SEE ALSO:
"Loving Vincent" (2016) Trailer


Thanks to the Pikazo app, "neural style transfer" is becoming an accessible area for everyone to explore! Using just your own smartphone and a bit of creativity, you can create "NEW" and interesting art. Have fun and enjoy!



Famous photo of Dalai Lama, using my own Instant Noodles as style.