Monday, December 29, 2008

iPhone dev 20

Found one piece of the puzzle. To make views visible, the window has a method called "addSubview", which seems similar to addChild in ActionScript. A default windowed iPhone project already has a "window" attribute in the delegate class. applicationDidFinishLaunching also contains a reference to the XIB file. A view controller has an initWithNibName method that takes the name of a XIB file.
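If I've understood this right, wiring those pieces together might look something like the following Objective-C sketch. This is my guess, not verified code, and "OtherViewController" and "OtherView" are made-up names:

```objc
// Hypothetical sketch: instantiate a controller from a second XIB and
// make its view visible by adding it as a subview of the window.
// "OtherViewController" and "OtherView" are illustrative names only.
OtherViewController *other =
    [[OtherViewController alloc] initWithNibName:@"OtherView" bundle:nil];
[window addSubview:other.view];
```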

Found an answer to the source code generation question too:

It doesn't generate source code. Instead it allows you to manipulate objects directly and then save those objects in an archive called a nib file. At runtime, when a nib file is loaded, the objects are unarchived and restored to the state they were in when you saved the file.

iPhone dev 19

So this is how the code knows about objects from Interface Builder. The code has a class called (something)AppDelegate, which has an applicationDidFinishLaunching method that gets called at startup. The *AppDelegate class can have IBOutlets for different things like buttons and the window from IB, so that it can reference them.
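As a sketch of what such a delegate header might look like (class and outlet names are my own illustration, not from any real project):

```objc
// Hedged sketch of an app delegate header with IBOutlets.
// The connections to the actual objects are made in Interface Builder.
@interface MyAppDelegate : NSObject <UIApplicationDelegate> {
    IBOutlet UIWindow *window;
    IBOutlet UIButton *someButton; // a button laid out in the XIB
}
- (void)applicationDidFinishLaunching:(UIApplication *)application;
@end
```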

But still, how do I add a new XIB file containing a different view to the project and reference it from code?

iPhone dev 18

I'm back from the sidetrack, at least for now. Game prototypes and MySpace widgets were taking up my time, and probably will again a bit later, but today I'm figuring out iPhone stuff again.

I really still don't understand the relationship between the Interface Builder and code. If I make a view in Interface Builder and add the XIB file to my project and build the project, did it now create a class definition for it automatically? How can I even find out? I feel frustrated that magical things are happening in the background, code is apparently created that I have no awareness of. But this feeling is familiar, I know it will be replaced by understanding if I just continue on this long enough. Things will click into place, and then it will be a breeze to create things. That'll keep me motivated to continue on.

What I want to accomplish is simple. I want a button on the main screen that, when pressed, slides out another view which is also constructed in Interface Builder. But to do this I somehow need to reference this other view from code, and I don't know how it gets there. I see at least these options for how this might work:

a) Perhaps if I just add a XIB file to a project, Xcode does something magical behind the scenes that results in a class definition being available to all code, which I then have to instantiate. This seems unlikely somehow.

b) Perhaps adding a XIB file to a project causes objects defined in the file to be instantiated. This seems likely, I remember reading that XIB files are kind of frozen object relationships. But when they are thawed, where are they?

c) Adding a XIB file to a project is not enough by itself, but you need to create your own proxy classes in Xcode to be able to reference them. But this wouldn't solve the issue previously mentioned. Even if I know the class names, I still need to reference the thawed objects.

Gimme the thawed objects!

Sunday, December 14, 2008

YouTube is easy?

I read a lot of blogs where developers voice their opinions on how to do successful things. YouTube is often raised as an example of a service that was easy to implement, but filled a real need in a user-friendly way (unlike Google Video) and so became popular. I agree with everything except the "easy" part. Compared to other web services, YouTube really doesn't seem to be on the easy side.

YouTube seems like a total nightmare to scale for starters. Their bandwidth usage boggles the mind. YouTube currently uses more bandwidth than the entire Internet did less than eight years ago! They spend an estimated million bucks PER DAY on bandwidth.

So OK, perhaps when they say it was easy to implement, they mean when it started, not the way it is right now. Some blogs like to point out that because they use PHP and MySQL, it was easy to implement. Granted, the site UI itself doesn't seem especially difficult. But is that really the meaty part of YouTube? Personally I think transcoding is the essential and also the difficult part. Perhaps they use something like FFmpeg to do it, making it easier than I imagine, but they really had to overcome some kind of mental hurdle to believe that this uploading thing could ever work well enough.

Also from a legal standpoint it does not seem easy. YouTube seems like it might have been much easier to start from China or somewhere else with less strict copyright laws. Being a company in the US, I wonder how they got any sleep at night with lawsuits and cease-and-desist letters flying at them left and right.

And now they have yet another tiny little detail: how to even make any money from all this. The founders, Chad Hurley and Steve Chen, of course don't really have to worry too much about this any longer, but I'm thinking about it from the point of view of the business itself. Assuming they really are spending $1M/day on bandwidth, and believing the estimate that they have 64M monthly uniques, they would need to extract about 50 cents per user per month just to cover the bandwidth cost.

I don't believe they are doomed. It can likely all be made to work, maybe that $1M/day bandwidth will cost a small fraction of that after 10 years. Maybe there is some magical d'uh-obvious moneymaking scheme they will come up with. What I just mean is that none of this makes me feel like YouTube should be the first company people pick as an example of something "easy" to do.

Friday, December 12, 2008

Here's how to monetize Twitter

Add a gifting feature called "bribes", similar to the gifts you can send on Facebook, except they are gifts that all of your followers will get. They are collectible items totally controlled by Twitter, and cost real money to buy. You would buy these items to reward your followers and to encourage more people to follow you.

Wednesday, November 26, 2008

Never let a Finnish man drink too much

A drunken 38-year-old man caused extensive damage and kept the police occupied with a mad nightly rampage. All of this happened in the span of a few hours.

To start the night, he stole a woman's wallet in a bar. After the bar closed, he started walking home, but stopped on the way to kick down a dozen gravestones at the graveyard. After that he broke some windows of nearby houses and set a moped on fire. Soon after, he also set three garbage collection points on fire. One of the fires spread to a van parked next to it.

Next he wandered to a kindergarten and set fire to a baby carriage left outside. Then he went on to a home for the deaf, attempting to conceal his identity by putting a shopping bag on his head. There he beat a night shift worker with a stick, broke windows and equipment, and demanded medicine. A female nurse managed to lock herself in a room and alert the police by phone. However, an ambulance arrived before the police did; the man stole it and drove around the town for half an hour before smashing its windows and abandoning it. After this he called a taxi and went home to sleep.

The police already had an idea who the culprit might be, so they caught him at his home after he had already fallen asleep. He was interrogated and blamed alcohol for his behavior.

( this was a translation of a 1994 news clipping from Finland )

Tuesday, November 25, 2008

Protostorming again: Build a School

In this MySpace game you maintain your own school. You and your friends take the roles of teachers, competing against other schools for the highest GPA. GPA for a school might be calculated as (your study points)/(best player's study points)*5. The teachers collect "study points" for a school by giving the students lectures in a simple quiz minigame (prototype screenshot below).

Reaching certain study point thresholds would cause events to occur, which might lead to positive or negative consequences, or cause the player to receive an item. Players would often receive the same items, and would then be prompted with something like "you don't need two blackboards! how about giving this to a friend?". This would be the second viral channel after teacher recruiting.

A third one could be a minigame where you have to gather friends to upgrade a school. At first the school would just be outside on the grass; then it could get walls, and finally a nice floor too. How this upgrading part would work, I'm not sure. Perhaps it could even just be an event when a certain study point limit is reached, although then it wouldn't help the virality (but would still reward players for continuing to play). (Lil) Green Patch is basically just about giving items, and has thrived on just that. I tend to think in too complex terms. Maybe minigames aren't even necessary, or might even complicate things too much? Perhaps it suffices to incentivize players by donating all or part of revenues towards building actual schools in third-world countries, which the best players could name.

Friday, November 21, 2008

Seth Godin in my head

Say nothing, but say it big, inspire. No facts, no proof, but it tastes so sweet. Seth Godin saying in my head "be a leader, you must do it, the world needs you!". How can you not smile, even start to believe? Then, a month from now I might not be any different for having listened to his audio book, but for that short while I felt great. Donuts for the mind.

Tuesday, November 18, 2008

Fire game prototype

I can survive 163 seconds, how about you? I learned from making this prototype that having a good difficulty curve is important. In this prototype, the player is entertained for the first 5 seconds, then bored for 60 seconds, then perhaps slightly curious for the next 30 seconds, and finally faced with an abruptly impossible difficulty level, making the player feel they died through no fault of their own -> not fun.

Monday, November 17, 2008

Rotoscoping experiment

I did a small experiment to see how difficult it would be to track joints from video material. I had meant to do this back in the DOS days after seeing the cool rotoscoping in Another World (Out of this World), but back then it was too difficult to get the video into a form that I could easily read from code.

What I wanted was dancing material with an unchanging background and a static camera. Now that there is YouTube, it is really easy to find this. Next I downloaded the FLV movie using a 3rd party tool (there are many). I used "FLV Extract" to get the video stream from the FLV file as an AVI file. Then I used VirtualDub to crop it to a short segment, converted that segment into a GIF animation with GIF Movie Gear, and finally imported the animation into a Flash timeline.

Next I attempted to track the left wrist of the dancer to see how much work it is. As a first test it took about 5 minutes of concentrated effort to mark 50 frames. Mostly I was so slow because I needed to move forward and create keyframes in the timeline and drag the joint marker with my mouse. I think if I made a small tool that let me just tap on a joint with my drawing pad, I could manage in under 5 seconds per joint per frame.

At least 20 joints would be necessary to make a stick figure dance like the dancer. Three minutes of YouTube video at 12 fps is 2160 frames times 20 joints is 43200 taps times 5 seconds is 60 hours! And of course this data is 2D, so it isn't even clear what could be accomplished with it. One thing did occur to me though -> you could use this as an affordable motion capture solution for games. Suppose you have a front camera and a side camera filming the same footage, so 120 hours of work to track it. Send the task to China, suppose it costs $5/hour -> total $600 for probably all of the motions needed for a small game.


Rough prototype of the day. The graphics are placeholders I found with Google image search. To try this, select a tile from "power", then click on a grid to place some. That's all you can do for now.

I've played many space shooter games, but a thing that bothers me is that the ships never have any kind of internal structure. Your ship tends to be just a scalar value like HP, and when it reaches 0 you die. I want ships to have inner structure. Imagine a space battle. Wouldn't you much prefer to hit your enemy in their warp core to cause a devastating chain reaction that destroys the entire ship?

Saturday, November 15, 2008

Social networks as maps of real connections

When thinking of what kind of apps to make for social networks, thinking about what people do in their everyday lives can be a great source of ideas. A big part of social networks is mapping out real connections. The concept of the "Social Graph" is so well-worn that it's almost nauseating to think about it any further, but there are other connections besides friendships, not all of which are necessarily mapped out yet. For some, it probably makes no sense to map them.

A person has...
a name (Name Analyzer -- at least I thought about this one!) both official and nicknames (Nickname)
age (Birthday Calendar -- still kicking myself for not thinking about this first)
x,y,z location at each time they existed -> birthplace, countries they visited (Places I've Been), grave
photons disturbed by them (Photos/posting of videos)
likes/dislikes about food, movies, music, smells, colors...
personality (Compare People)
knowledge (Who Has The Biggest Brain, Scrabble, Geo Challenge)
medical conditions, sleep cycle, height, weight, dna (a bit too costly to compare your DNA with friends, but we'll get there ;) )
ownership of physical/virtual items (can own a car, a house, stocks, money, music recordings, WoW gold...)
relationships with humans -- idols, friendships, relatives, enemies, employees/employers
opinions about...
- other people (Hot or Not, Compare People)
- what is wrong and right
- politics
- which gods if any exist

"Groups" by Facebook is very much a meta-application in that almost anything can be expressed as a group. It may be that applications that can add value beyond what membership in a group could offer will be successful. One other app I'm kicking myself for not coming up with is Bumper Sticker, which solves the same meta-problem that Groups solves, but in a more visual way. If there is any other undiscovered angle on this besides Groups and Bumper Sticker, it will be huge. Another thing to think about is whether there would be value in aggregating the connections of your friends in some way. For example, you have an age, and while that in itself might make a great app (Brain Age), aggregating friends' ages also gives one (Birthday Calendar).

If you can think of any other general sources of ideas, please post some. I'm sure there are still a few d'uh-obvious ideas out there to be discovered!

Tuesday, November 11, 2008

Secant square

Coded something just for fun for a change.

This is a square drawing routine that uses trigonometry instead of the usual way to create a square. The sliders skew the trig equation in different ways.

Harry the load balancer - a true story

Harry works at an ISP called Elisa, in a depressing country near the north pole called "Finland". Imagine yourself as Harry. Surrounded by the cold, dark reality that is Finland, sitting in a noisy computer room, doing possibly the most menial job on the planet. Harry is a load balancer. He gets requests, then just randomly forwards them to one of his clients. Even mail sorters have more fun. "And probably wilder parties too", Harry complained to his friend Tom, also a load balancer. "And probably wilder parties too", Tom said with his identical software. Their shared conclusion was to stop putting up with this situation any longer. But what can you do, how can you protest if you are but a tiny load balancer?

Tom and Harry pondered this in parallel. They could of course call it quits completely, or commit suicide as a Finnish load balancer might do, but Harry had other plans. "Let's have some fun while we're still around", Harry and Tom decided simultaneously. The real nature of Harry's job was to split DHCP requests coming from the 500,000 customers of the ISP. Tom wasn't currently connected. His job was to act as the back-up guy to Harry, in case anything happened to him. Well, now something was about to happen to him. "Wouldn't it be fun to see what happens if I forward all of the traffic to a single client?", Harry thought full of jubilant delight. "Don't do it, man", the DHCP server pleaded, but it was too late. The DHCP server exploded violently, leaving a bloody mess on the server room floor.

Harry should have been more careful with his crime. There was a rat in the room, not the animal kind of rat, but the kind of rat that rats out mischievous load balancers. That rat was a box configured to send an SMS to a human in case anything like this happened. "Get ready to sing Bicycle Built for Two", the rat said as human steps could be heard walking at a steady pace towards the computer room. The human, let's call him Dave, although that certainly isn't his name, opened the door and entered the computer room. "I'm afraid", Harry whimpered as the human started to unplug his wires. "But, good my brother, do not, as some ungracious pastors do, show me the steep and thorny way to heaven, whiles, like a puffed and reckless libertine, himself the primrose path of dalliance treads, and recks not his own rede", Harry might have said to Tom at this point if he was into Shakespeare, but instead he said: "brother, your time to serve has come". Harry was unplugged now, with Tom in his place. Dave was satisfied, and decided to walk to the soda machine to get a cola.

What happened while Dave was away should be clear. Tom, being identical to Harry in every way, had also decided to be naughty today, so all the load was still being un-balanced onto a single agonized DHCP server. However, it was getting quite late at this point, so there wasn't that much traffic any longer. So superficially everything was fine, and Dave could go home to watch the full DVD release of My Little Pony or whatever it is that IT workers do at home. The next morning, of course, would be a bloodbath.

Based on a true story about how 500,000 Finnish Internet users -- or 10% of the Finnish population -- lost their Internet connections for hours. Original Finnish language news article.

Monday, November 10, 2008

Where to live?

I've been trying to find a place to live where my savings would last for a long time. I found this site: Cheapest Countries in Asia. It says Indonesia is the cheapest. I believe this list, because Thailand at least really is about three times cheaper than Japan.

So now I have a list of cheap places. The next piece of this puzzle would be to discover which places have good Internet connectivity. Here are SpeedTest results for Indonesia. They show that some places do have nice speeds, but say nothing about reliability. It doesn't really matter if it's broadband speed if it only works every other day.

The search continues.

Friday, November 07, 2008

App got acquired

A small application I made was acquired by a company yesterday. I think the price was fair, this deal is very likely a win-win for both me and the new owner of the app. The price was a multiple lower than what I would have asked just a few months ago. While page impressions have remained stable, CPM -- basically how much money you can get from banner ads -- was much lower.

It may be that this was not a good deal for me, because if I had kept the app it could have continued to make ad revenue for the rest of my life. It's very difficult to say, but at least I now certainly have enough to live on for a few years. Also, maintaining a simple app like that for years would certainly have distracted me; now I have space in my mind to come up with other things. I'm always bursting with ideas, but often too distracted to act on them.

It's good to sometimes know to let go.

Sunday, November 02, 2008

Another World nostalgia gun

Nowadays, I mostly code out of greed. Having some success getting some ad revenue, I try to figure out ways to get moar. However there are some things which I would like to create just for fun. One of them is adding the Another World gun to the game Half-Life 2 Deathmatch. The illustration below of how the gun works is lifted from the author Eric Chahi's page for the Another World game.

Another World (called "Out of this World" in the USA) was one of the first, if not the first, 2D computer games to use polygonal graphics. It was also remarkable in that this very graphical game was drawn and programmed by the author alone; outside help was only used for the music. It is rare to have both artistic and programming talent in the same person, although that may be somewhat of a self-fulfilling prophecy, as because of this belief programmers might get discouraged from even trying. In any case, the gun works such that pressing quickly fires a thin ray, holding the button down for a moment and then releasing creates a shield, and holding for a long time and then releasing creates a huge burst which can also destroy shields. Shields can be walked through, but thin rays cannot pass them. This enables you to extend an arm from behind the shield to shoot from safety.

I imagine in 3D this would work very similarly, except of course the shield would be a plane, extending a limited amount. For artwork it could use the shields already in the game, as seen in the picture above. So what's preventing me from adding this? Well, since it's a gun I think it could not be just a server modification, and it seems like overkill to release a client-side mod just to add one gun to it. If I did that, well likely nobody would install it, and nothing is sadder than a nice mod nobody uses. I've seen some great mods that added really nice weaponry like flame throwers, but nobody was playing those mods. Now that leads me to the next suggestion.

Multiplayer deathmatch games like this should have a plug-in system, where servers could push sandboxed client-side code snippets to the players. This would create an incentive for server owners to create new stuff and weapons, since it would attract more players to their servers. It would also be good for the players, because then we'd have all that new stuff. Sort of like Second Life, except with an engine more suitable for combat. The controls and "gameplay" feel in Second Life are horrible, so it certainly doesn't fill this need.


Thursday, October 30, 2008

Room-based games

My attention ended up divided after visiting a friend to show the cave flying game. Because of our brainstorming, I now feel more like I should be doing Internet-based games instead. So I've been working on that; luckily I had already worked on it for a good while a month ago, so I have some infrastructure in place. I have a general idea and also a server set up to host the lobby, which is the place that lists existing games so you can pick one to join. I have two different games partially working, but on the client side I've only gotten far enough to resolve a domain name. I haven't even figured out how to transfer data yet, but how hard can that be?

Yesterday I figured out how the login system should work. After weighing several options it seems that normal login/password will have to do. I cannot tie accounts to a certain iPhone based on device ID alone, because a person might change their device to an iPhone 4G or whatever, or someone else might want to play on the same device. I hate asking people to create accounts, because I hate creating accounts myself, so maybe I will have an option to play as guest. In that case your score won't be saved though, so you'll start from zero again each time.

Of course the best option would be to always start off as a guest, and then only after you've made some progress to have the option to lock that guest account as your own account... hmm...

Tuesday, October 28, 2008

Game design time

I've decided to continue on the cave flying game, at least for now. Well truth be told, I've spent the entire day just slacking off and reading game design related articles. Lost Garden has an interesting presentation called Mixing Games and Applications. It skims some common game mechanics, so it was an interesting thing to read while trying to decide what kind of levels my game should have, how the learning curve should go. I've never been a fan of tutorials inside games, and reading that presentation made me even more determined that the user should be allowed to discover how to play, instead of being explicitly told what to do.

In my prototype, I can tilt the phone on two axes to move the ship along the X and Y coordinates. In the proto the only activity is obstacle avoidance. If I merely throw the player into this with different kinds of obstacle courses, it would seem quite boring. Instead I should put them on a nice curve where they learn new things and are challenged after each level. I don't think I need new types of things on every level; sometimes I'll probably get away with reusing older challenges and just increasing the speed and number of obstacles a bit, but new elements should be introduced at times.

I'm even considering restricting movement to the X axis at first, pinning the player to the bottom of the screen. Perhaps I can then surprise them in a later level with the ability to also fly up and down. How to show them that they can now move their ship along the Y axis without explicitly telling them to tilt the phone, I'm not sure.

Monday, October 27, 2008

Collision detection works!

2D collisions seem to be working well. In this case they are a bit more accurate than using a bounding sphere. So filled with enthusiasm (okay, stock market fear, but somewhere deep down there was some enthusiasm too) I went to show my project in its current state to a friend. What was his reaction? It was: "omg have you gone mad, you bought a mac, traitor!". It took a while for the situation to recover from that, but eventually he recognized that the iPhone is a pretty cool platform.

Sadly he wasn't all that into my project, and instead we started to brainstorm what I should REALLY be doing. That isn't so bad, as mostly coding this has been a learning experience. We agreed that it should be something with clear mass appeal, and something challenging enough that there would be a bit less competition. We figured that maybe most developers are not as comfortable with network programming as we are, so we should make an Internet-based game. As a bonus, there don't seem to be very many of those on the platform yet.

But should I really just abandon this project I've been working on? It's been my experience that if you always abandon what you are doing when you discover something even better to do, you end up never completing anything. Have to admit though these lobby-based games would seem to have way wider appeal.

Thursday, October 23, 2008

Collision detection thoughts

So now that I have my method, lovingly called "getTrianglesTransformedByCurrentOpenGLMatrix", which does seem to produce results identical to the accelerated transforms, how do I use it for collision detection? Well, for the needs of this game I would like to know if the spaceship is going to collide with the next obstacle or not. I would like to know that even before the collision happens, so that I can warn the user. Then, when the obstacle is near enough and the player has not adjusted their position, the ship should explode.

Earlier I was planning to do this properly, to actually see if the polygonal objects intersect or not, but a friend convinced me otherwise. It won't matter as long as it works well enough that the play experience isn't disturbed by it. So I will instead just have a two-dimensional collision volume for the ship and disregard the Z coordinate in the collision detection. I think I'll place this collision volume at the base of the ship, because that part is most visible to the player and any error there would be too glaring.

My obstacles are very low-poly, but I have some power-ups that may be smaller than the ship itself. If I do the detection with simple is-vertex-inside-any-triangle tests, then I should probably subdivide the collision volume to add some extra vertices, so that a power-up can't just slide through because no vertex of the ship happened to be inside any of the power-up's triangles.

Matrices from OpenGL, without OpenGL

"For programming purposes, OpenGL matrices are 16-value arrays with base vectors laid out contiguously in memory. The translation components occupy the 13th, 14th, and 15th elements of the 16-element matrix, where indices are numbered from 1 to 16 as described in section 2.11.2 of the OpenGL 2.1 Specification."

This at least clarifies the order of the values given to me by the glGetFloatv call. Now if I have an x,y,z vertex, how do I transform it by the returned matrix? I found mention on the web that I'm supposed to divide by W. But if I don't have a W to begin with, what should it be? Hmm, it makes sense that it should be one. Now I should be able to do the multiplication. Let's see if I'll manage to introduce a bug here:

// got vertex[0..2] already; multiply by the matrix, then divide components by w
vertex[3] = 1; // w
for (i = 0; i < 4; i++) {
    newVertex[i] = m[i]*vertex[0] + m[4+i]*vertex[1] + m[8+i]*vertex[2] + m[12+i]*vertex[3];
}
newVertex[0] /= newVertex[3];
newVertex[1] /= newVertex[3];
newVertex[2] /= newVertex[3];

Edit: Wow, it works.

Collision detection continued

I've been working on, or at least thinking about, the collision detection problem for the past few days, at least while not distracted by the financial crisis. I'm an eternal optimist and have been buying stock regardless of the downturn, but it has not changed direction yet, and it makes me almost physically nauseous to watch my money disappear at an alarming pace from my E*Trade account. So I tend to log on to E*Trade and click refresh refresh refresh instead of working.

One slight problem I encountered before I could even begin testing for collisions is that I only have access to local coordinates, but I need world coordinates. Normally the local -> world transformation is performed by OpenGL, but it is not possible to access the transformed coordinates because they only exist in the 3D accelerator chip for an instant. AFAIK I now have to ask OpenGL to give me the matrix (glGetFloatv), gather all vertex coordinates from the meshes and then do the matrix multiplication myself. Currently I'm really confused about the order of components in the matrix given to me by OpenGL. Also I'm not sure what to do with the extra row and column that matrix has. I suspect it is about the "w component" which I have to somehow multiply or divide x, y, z by, but I'm not sure exactly how.

Until I understand this, I suppose any attempt to code this will just result in a tangled mess.

Ralph Waldo Emerson

Poet/philosopher Ralph Waldo Emerson seems to be a startuppy kind of guy. I enjoyed this quote particularly:

"What I must do is all that concerns me, not what the people think. This rule, equally arduous in actual and in intellectual life, may serve for the whole distinction between greatness and meanness. It is the harder, because you will always find those who think they know what is your duty better than you know it. It is easy in the world to live after the world's opinion; it is easy in solitude to live after our own; but the great man is he who in the midst of the crowd keeps with perfect sweetness the independence of solitude."

Tuesday, October 21, 2008

Another way to do the collision detection

Here's another idea I had for detecting the collision. I'm not sure how to do the line-triangle intersection detection though, so whether this is simpler would depend on that.

Read a bit on the subject. It seems to be simple. To know if a line segment defined by two points goes through a triangle, first you check if the segment crosses the plane defined by the triangle. This is actually cleverly easy: see if the start point of the segment is on the opposite side of the plane from the end point. But hmm... somehow I need to know the intersection point to do the point-in-triangle check after that...

iPhone tunnel game progress

It's been a few days, so how is the game coming along? Quite well, actually. I took a step back to think about how I could have multiple levels of content. If I set everything up in code, it will be too laborious to create any meaningful amount of play. I came up with a simple level system that lets me make each level a single text file that events can easily be added to.

The ship slides forward through the level at variable speed and there is a certain draw distance that the program tries to maintain. If it notices that an object mentioned in the level file has come within draw distance, it makes an instance of it. At first it did this by loading the model file from disk (or is it flash RAM?), but that created a one-frame pause in the game whenever an object was loaded, so I had to preload everything at the beginning of a level and then just make references to the already in-memory objects when they come into view.

Currently the level file has just two different lines. Either the graphics for a tunnel should change at some depth, or an obstacle should appear at some depth. This seems to work well now, I created a level about 10 seconds long with various obstacles appearing that the player can avoid by tilting the device. It's not clear from this whether or not this would be an enjoyable game, but I think it might be. Obstacle avoidance is a pretty common game element, and players do seem to enjoy it.

I've now come to a sort of mental block. The player cannot crash into the obstacles; they just slide through them. I feel that the collision detection code is absolutely crucial to get right. If players feel that collisions aren't handled properly, they may feel betrayed by the game. If you die, it should be your own fault, not the fault of inadequate collision detection. But 3D collision detection is not an easy problem. Luckily in my case the player object is very simple, and the obstacles are totally flat.

I was really happy that OpenGL was doing all the matrix operations for me, but now it's coming back to bite me. To do collision detection, I need to know where the vertices are in world space. So I think I'll have to make matrix multiplication code anyway that can mimic what OpenGL is doing, so I can get the post-transform data. After I have the ship and an obstacle in world space, I should be able to see where the flat obstacle is in relation to the ship, then take a z-slice of the ship at that point. After this the collision detection becomes a 2D issue of seeing whether the flat obstacle should collide with a flat slice of the ship.

I also plan to have spherical power-ups and bonuses that can be picked up. For those it could be sufficient to check whether any vertex of the world-space ship is inside the sphere.

Thursday, October 16, 2008

iPhone 3D object spinning retrospect

Wow, it works. Last time I was trying to outline what I would need to get a 3D object loader and displayer working, and now about three days later it works -- I have a mushroom I created in Meshworks spinning smoothly on the iPhone. Quite pretty. Now let's see what I listed three days ago and see how it panned out.

I thought I would need bitmaps for the textures and a record of which texture to use with which mesh. Of course it turned out to be more complicated: I realized I would also need texture coordinates for each vertex. I decided that texture mapping is not important at this point; I cannot allow myself to become one of those people who tinker on a 3D engine in their spare time. No, this has to become a playable game as fast as possible, and texture mapping usually isn't essential to gameplay.

I figured I'd need a list of meshes. Now I have a nice 3D object class, each instance of which contains 3D mesh instances. Each mesh contains a vertex list plus the color of the mesh, which I could easily get from the file I parse. I was worried about the vertex data loader being complex, but by taking some shortcuts it was actually easy to get that information out of a WRL file output by Meshworks. I didn't attempt to write a general WRL reader; mine only understands the specific output of Meshworks, so if there is extra whitespace in the wrong place, it won't work. That means I've decided to stick with Meshworks, even this particular version of it.

I supposed there would be a list of vertices, then another list of triangles referring to the vertex list. That's exactly how it was in the WRL file. I made the unnecessary move of expanding that data into a plain polygon list with no shared vertices, but it turns out OpenGL ES would have known how to handle the indexed data by itself.

I had totally ignored lighting in my original list. To know the brightness of each polygon, I had to specify where the lights are, and the material properties like how strong specular highlights should be on a surface. And to be able to compute these things, of course OpenGL then wanted to know where the surface normals are pointing. I tried to refer to my linear algebra text, but in the end did the copypasta PHP coder thing and just copied the normal calculation routine from some sample code. Well, maybe I mistyped something, but I had to tweak it for hours before it actually calculated the normals correctly. It was really difficult to debug, because just looking at float values in a debugger it's not so easy to say if a vector is pointing to the correct direction.

Another thing I ignored was setting up the projection to look OK. When you create a sample project in Xcode, initially it sets you up with 2D projection. All the sample code on the net refers to some GLU functions to set up a perspective projection, but those are missing from my framework. I guess the right thing to do may have been to again learn from my lin. algebra text how to REALLY do it, but instead I again just copied a working projection matrix from an example. Just too eager to get this project forward!

I've learned a great deal about OpenGL in the past 3 days, and it's exciting to be able to rather easily display 3D objects now. Hopefully this will be useful later, and not just a distraction. I'll try to blog some more about my progress soon.

Monday, October 13, 2008

Graphics time - 3D

I did some profiling and noticed that around half of the time was spent in the rand() calls. I was also writing data 8 bits at a time; changing that to 32 bits immediately boosted performance, but CPU usage was still 100% and FPS was only around 13, fluctuating depending on background processes. If what's actually happening is that Quartz is making a texture of my bitmap, uploading the texture to the GPU and rendering it that way, then I would be closer to the metal by just using OpenGL ES directly. Now this is a bit scary for me though, as I don't really know anything about OpenGL. I do know some basics about vectors, matrices etc., but the biggest thing I've done is a rotating cube (which did take 3rd place in a Javascript competition though haha).

I read some introductory text, but the concept of "shaders" bothers me. What the heck is a shader? I remember checking it on wikipedia before, but the best I could understand is that it's some kind of routine executed on the GPU against a vertex, or maybe they can be executed per-pixel too? Just guessing from the names "vertex shader" and "pixel shader". But what is a "fragment shader"? No idea. Wikipedia: "A pixel shader is a shader program, often executed on a graphics processing unit. It adds 3D shading and lighting effects to pixels in an image, for example those in video games. Microsoft's Direct X and Open GL support pixel shaders. In OpenGL a pixel shader is called a fragment shader." Ah, just a synonym.

Now, I have to admit it would be very sexy to display some 3D models of my own. But seriously, no more cubes! I've done so many of them. Always a cube on a new platform, then I get bored and make another cube a year later on another platform. Na-ah, should be a proper model at least now if I try this at all. But how do I get models with some data easy enough to load? I'm scared. Stuff I imagine I will need to load:

- bitmaps of the textures
- list of meshes
- which texture goes with which mesh
- coordinates of vertices in each mesh
- which vertices form polygons

Then to spin a 3D object...

- load object & textures
- do opengl magic to let it know about list of vertices and polygons
- maybe enter into some texture modes before each mesh? dunno.
- to spin, perhaps alter the object space -> world space transformation matrix?
- will opengl remember my vertex list etc. or do I tell it again on each frame? no idea.

So you can see I'm a bit confused about this. Let's see if there is some simple modeling tool for mac.

Saturday, October 11, 2008

Graphics time

I've been attending demo scene events for years, so I have a certain respect for software rendered gfx effects. Now I'm curious about how to push pixels on the iPhone, so let's see how far I can get with that tonight!

First up: diving into the Core Graphics documentation. Hmm.. tried to check how many colors the iPhone screen can actually display. Specs on Apple's page don't mention it. Certainly looks like more than 64k colors, but must be less than full 24-bit color or otherwise they would prominently advertise it as a feature. Just wondering if my framebuffer should be 24bits to make it as native as possible.

... 6 hours pass ...

I created and displayed my first raw bitmap data! Feel free to copy my code (please note it turned out to be too slow for much use). As a disclaimer I just got this to display anything without crashing minutes ago, so there's likely something still wrong with the code. Here's the init part.

CGDataProviderRef provider;
bitmap = malloc(320*480*4);
provider = CGDataProviderCreateWithData(NULL, bitmap, 320*480*4, NULL);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
ir = CGImageCreate(320, 480,     // width, height
                   8, 32,        // bits per component, bits per pixel
                   4 * 320,      // bytes per row
                   colorSpaceRef, kCGImageAlphaNoneSkipLast,
                   provider, NULL, 0, kCGRenderingIntentDefault);

And then when I want to show a buffer:

for (int i=0; i<320*480*4; i++) {
    bitmap[i] = rand()%256;
}
CGRect rect = CGRectMake(0, 0, 320, 480);
CGContextDrawImage(context, rect, ir);

My only problems now are that I'm obviously leaking memory by not deallocating anything (I should at the very least free() the bitmap data), and that my code lives in drawRect and only runs once. I don't know how to get the screen to refresh. Also I have no idea if this will be fast enough to refresh at 30fps, but I'm guessing it should be. It scares me a bit that I can't really know what unnecessary hoops this code is jumping through on the iPhone, since I'm not getting a raw display buffer pointer but instead going through some classes that do who knows what before the data actually ends up on the screen.

I discovered another adventure gamish thing that you can do in the Interface Builder - sometimes it's possible to drag code files from Xcode to IB to get IB to notice they exist. I would have never thought about even trying that, just saw it mentioned on another blog. Still trying to wrap my head around the relationship between Xcode and IB.

Ugh! I tried running the above code on a real iPhone device and was only getting around 5fps! Clearly I'm doing something wrong, the iPhone is definitely powerful enough to push pixels if I just figure out a better way to do the updates. But right now I'm too sleepy to think about anything except maybe getting some quality time with HL2DM before getting some sleep =)

Thursday, October 09, 2008

Interface Builder strikes again

Just spent hours on a simple thing I couldn't understand. I had a tab view controller in Interface Builder, and in my own code I had a class called FirstViewController. To reference this, I figured I should add a FirstViewController type view controller into Interface Builder as well. Hilarity ensued: unbeknownst to me, the tab view controller was already instantiating a FirstViewController (not sure how that works), so I had TWO instances of the same controller. At first I was really perplexed how on earth my instance variables were suddenly changing values in the debugger, then happened to notice that the address of "self" was different.

I just wanted my FirstViewController to be the delegate of a picker object, but now I had a different instance being the delegate and a different instance doing other things. After learning my mistake I was trying to hunt down the extra instance, and noticed that one of the tabs in the tab controller was already declaring itself a FirstViewController. Well, how to reference that, since the class isn't visible in MainWindow.xib along the other stuff? Took a bit longer to realize I can drag delegate references not only to the xib window, but also to certain visible controls! Felt like one of those Lucasarts adventure games where you miss a puzzle because you don't notice that a certain object was clickable.

Audio works!

Can't believe it's only been two days, because I feel like I've been battling with iPhone audio forever. I was trying to set it up, but somehow my callback was never called and I was starting to lose hope. I tried to keep things minimal, but it turned out I was keeping them too minimal, because I was neglecting to prime my sound buffers. I thought I wouldn't need to prime them, that I could just start playback and then fill the buffers as they were requested from the callback function. Turns out the callback is only called when a buffer runs out, and since I had added no buffers it never got called!

To keep the callback function simple, I thought I would just create noise with rand() and fill the buffer with that instead of reading from a file. Again I neglected something important: setting bufferReference->mAudioDataByteSize. It was 0 by default, so the sound system must have figured there is no more sound to play. After fixing that I heard the sweetest sound ever: white noise being played from my phone!

Next up: learning how to use picker view to select sound waveform.

Tuesday, October 07, 2008

Next step: audio

Now that the first test app works and I somewhat understand what outlets are, I wonder what would be the next step? At least I should know how to have multiple views and change between them using a tab bar or similar control. So I should learn basic navigation.

As a brief detour though I am curious about how recording sound works. I have some ideas for apps that need sound recording, upload and download, but am a bit concerned that it might be a bit difficult. At least the network part. How do I know if the net connection is on? How do I show a progress bar for download/upload? Should there be a cancel button in case transfer is taking forever, for example if it happens over normal GPRS? What about compression, is there some basic compression algorithm included in the API?

I recall seeing some example code about sound recording, let's dig that up.

The example is called SpeakHere. Seems that there is no simple recordAudioOKThxBye-style function, but you have to stream it to a file yourself. Fair enough, it doesn't seem to be all that complicated to do, and is probably something I will eventually have to learn how to do anyway. There seems to be PCM encoding built in. Saw passing mention of MP3. I wonder which ones iPhone supports. MP3 would be sweet for shortening transfer times and also using the same files later when playing back from Flash, but is it possible?

Read up on "Audio Queue Services". Documentation mentioned the following "kAudioFormatMPEGLayer3 - MPEG-1/2, Layer 3 audio. Uses no flags. Available in iPhone OS 2.0 and later.", so it would appear the encoder is present. It's a bit overwhelming to set all of the structures at once and hope that I don't miss any vital flags or attributes, so I'll try to start with something really simple. Simplest thing I can imagine is setting up a callback function for sound playback and just fill the buffers with rand(), hopefully white noise can then be heard from the speaker.

Found a useful tutorial on the subject.

First test app works!

Phew, took a nice while to wrap my mind around how the controls work, but now I have a small app with three text fields constantly updating with the accelerometer data. For extra credit I added an image too which moves based on the accelerometer data.

Funny "bug" I had was registering to receive accelerometer events, then not receiving any. For the life of me I couldn't understand why. I was running the app in the simulator at this time. Went to meditate on this by pwning some noobs on Half-Life 2 DM and after coming back and looking at it again it was stupidly obvious - it's a SIMULATOR. It HAS NO accelerometer! So after running it on the real device it worked just fine :-)

How Xcode and Interface Builder relate

I'm starting to understand now how Xcode ties in with the Interface Builder.

My first confusion was this: in the main function, when UIApplicationMain is called, how can it know what its delegate is when it is not explicitly mentioned in the arguments? Answer: it's mentioned in the MainWindow.xib file. This file is an XML file which is turned into a "nib file" later (when building?). Double clicking on it in Xcode brings up the Interface Builder. Clicking on "file's owner" and then pressing apple-shift-I brings up the Inspector, where I could then see the delegate -› MoveMeAppDelegate relationship (wtf, had to press alt-b to get the › character).

Next I'd like to understand how to reference things set from the Interface Builder from my code. Specifically, how to change the text in a label? What identifies the label in my code?

[24 hours pass]

Okay wow, somehow that was really tough to figure out. To change the text in a label, I needed to get a reference to the label object. I was really confused trying to drag a line from "referencing outlet" to somewhere, with nothing accepting the drag. Turns out this is where IBOutlet comes into play. I had to have IBOutlet UILabel *label; in the class I am dragging to, and then the drag is accepted (although at one point I seemed to sense a delay before Interface Builder realized the drag could now be accepted?).

So the controller that accepted a drag from a textfield "referencing outlet" looks like this:

@interface ThreeFieldsViewController : UIViewController {
    IBOutlet UILabel *label;
}
@property (nonatomic, retain) IBOutlet UILabel *label;
@end

Then additionally in the .m file I had to @synthesize label. Didn't check if it would work without that. Actually, it would be interesting to test if the Interface Builder code that gets generated just sets the attribute directly, or calls setLabel? Let's see. Yep, setters and getters are called if and only if there is a referencing outlet.

As a bonus I discovered that if you make a method and tag it IBAction, you can drag action references from components to that in Interface Builder. Not sure if there are some interesting arguments passed that could somehow be read. Next up: trying to make an app with three labels that get updated by a timer with data from the accelerometer.

Monday, October 06, 2008

Interface Builder

For someone who hasn't coded much, I think it's a bit dangerous to start with a graphical interface builder in an IDE. It gives you the wrong idea that everything is really easy. Just drag and drop stuff and BAM (channeling Steve Jobs here)! Of course you'll end up spending most of the time (as you should) in the actual code, and building interfaces will just be a short break. At school we had tools like this, and there would be people who confused building applications with designing their interfaces; for them it was a shock how much work there was underneath, beyond just dragging stuff around to build the interface.

With this in mind I am approaching the Interface Builder a bit carefully, almost trying not to have too much fun with it. It of course does make sense to use it. I could create all the components in code, and almost prefer to do so, but still I have to admit that it must be faster to use this tool if I can just learn to use it properly. I want to get stuff done fast, therefore I must learn this. So I've started it up, started dragging stuff around. At this point I still don't understand how this ties to Xcode. I do know there are some "nib files" and that the controls can be raised from it, somehow relating to the initWithCoder method.

The goal for tonight will be to understand how to create some simple text labels in the Interface Builder and then how to set the text to those labels from Xcode.

Sunday, October 05, 2008

iPhone dev 17

Oh lord, I just discovered that curly braces require one extra keystroke on the Finnish keyboard layout on the mac. Somehow the keyboard layout isn't the familiar one from Windows. I would use the USA layout, but then writing scandinavian characters would be a pain. I have to press alt - shift - 8 to get a curly brace!

Spent hours today trying to find out why a sample application won't run on the iPhone. Turned out in my Info.plist file the bundle identifier was the same as with another app, so it wouldn't install another one with the same id.

Next I challenged myself to create a small app which would have three textfields that display the raw data coming from the accelerometers. I got stuck early -- I wanted to use a timer to fire an event at certain intervals. Spent a very long time trying to find info in the docs. Looked at some sample applications, but they were more hardcore and had actual threads to do the timing. Then finally NSTimer was mentioned in a forum post.

timo = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(onTimer) userInfo:nil repeats:YES];

- (void)onTimer {
    int test = 10;  // dummy statement, just somewhere to put a breakpoint
}

Disappointed a bit that I couldn't finish this dead simple app in one evening. I'm starting to get a feeling where iPhone development falls on the difficulty scale. Maybe 5 times easier than Symbian development, but still 2-3 times more time consuming than Flash development.

Hello World in Flash from nothing: 5 minutes
Hello World on the iPhone simulator: 2 hours
Hello World on the iPhone: 6 hours (mostly figuring out app signing, device id blah blah issues)
Hello World in Symbian: coder pronounced dead at the hospital due to massive internal bleeding

Code Signing Provisioning Profile

Came across a problem that I noticed others were also having on some forums.

When you create a provisioning profile in Apple's portal, download it, and go to Target Info to use it, it often isn't there. You try to click Code Signing Provisioning Profile > Any iPhone OS Device, but it doesn't show up in the list. I'm not sure if there is a cleverer way to do this, but I can get it to show by right-clicking, selecting "show definitions", and replacing the hexadecimal values shown with what I find in /Users/YOURUSERNAMEHERE/Library/MobileDevice/Provisioning Profiles. Then when I right-click again and select "show values", it's there.

Hope this helps someone =)

iPhone dev 16 - memory management

Found an article about the memory management issues. Noticed that I am indeed making a mistake. After allocing and initing an object, I am increasing the retain count by one. This is not necessary, the retain count is already one at this point.

Second mistake I think I am making is failing to release my textfielddelegate and UITextField. Maybe I could use autorelease?

But what happens when I set the delegate by doing testText.delegate = x? Is the retain count of x now incremented by the delegate setter method? The API docs show that the property is declared like so:
@property(assign) id delegate

"assign Specifies that the setter uses simple assignment. This is the default."

Okay, so it would seem that the retain count does not get incremented, which means it is my responsibility to release the object at the end. Great, seems to work!

iPhone dev 15 - How do I monitor the events that the textfield sends? Where does it send them to?

Last time I wondered what happens with the textfield events. Now I've read a bit more about delegates, how to use them in practice. Controls have a "delegate", called such because application-specific behavior is delegated to it. For textfields the delegate protocol is UITextFieldDelegate. To implement protocols, angle brackets are used when declaring a class, so here is the code I used in my declaration in MyTextFieldDelegate.h:

@interface MyTextFieldDelegate : NSObject <UITextFieldDelegate> {
}
- (BOOL)textFieldShouldReturn:(UITextField *)exTextField;
@end

Then implementation in MyTextFieldDelegate.m is very simple:

#import "MyTextFieldDelegate.h"
@implementation MyTextFieldDelegate
- (BOOL)textFieldShouldReturn:(UITextField *)exTextField {
    [exTextField resignFirstResponder];
    return YES;
}
@end

I found that "resignFirstResponder" line on some Mac development forum. I tried to read the docs a bit to learn what it means. The docs say, unhelpfully, that it makes something release its first responder status, whatever that is. In plain English, I discovered it means that it makes the soft keyboard disappear, at least in this case.

Well above you see the delegate class, but that wouldn't do much if the textfield doesn't know about its delegate. I think this part must be slightly wrong because I'm never releasing the objects I create, but as an initial test this worked:

MyTextFieldDelegate *x = [[MyTextFieldDelegate alloc] init];
[x retain];
UITextField *testText = [[UITextField alloc] initWithFrame:testFrame];
testText.delegate = x;

The dot notation surprised me. After all the talk about automatically synthesized getters and setters, I expected to actually have to call methods to set variables. After reading the docs a bit more, it turns out I actually AM calling a method here! The dot notation in Objective-C is exactly the same as calling [testText setDelegate:x]; the dot is just a shortcut. This is very clever, because it allows you to expose properties conveniently while still letting you run code when they are accessed, if necessary.

I'm starting to like this more and more, but memory management still confuses me. I wonder how to see what I am failing to release? I don't want to leak memory on someone's iPhone.

Saturday, October 04, 2008

iPhone dev 14 - my first control!

Created my first control in the MoveMe sample program in the file MoveMeAppDelegate.m, method applicationDidFinishLaunching. Defined size of a text field as a CGRect like so:
CGRect testFrame = CGRectMake(10, 10, 100, 100);

Next I created the object itself:
UITextField *testText = [[UITextField alloc] initWithFrame:testFrame];

To see it, I had to do some extra magic. This somehow adds it as subview. Had to call this after other similar calls to make sure it's not obscured:
[window addSubview:testText];

It wasn't immediately obvious that the field even appeared, but when clicking on the upper left where it is supposed to appear a keyboard did pop up and I was able to type. Stuff I'd like to understand next:

- When am I supposed to release this field?
(in dealloc do [testText release] maybe? nope, didn't work.)
- Why when creating UIViewController it is stored in self, and then released? Won't that destroy it? Apparently not.
- How do I output some debug text?
- How do I monitor the events that the textfield sends? Where does it send them to?

Friday, October 03, 2008

iPhone dev 13

Already the fifth day of development, with not much to show for it. Going through the MoveMe sample application now. Yesterday got an OpenGL ES sample running. So cool that a little device like this actually runs OpenGL. I don't yet grasp the structure of programs very well. I know that there is some function you call in main, which starts off the message loop and apparently your own code goes into a delegate class.

Learned that building an Xcode project turns it into a "bundle", which is a directory on the iPhone that contains code and data. Since I don't believe anyone reads this (just talking out loud to concentrate more on the task), I guess I can reveal what it is that I'd like to build. Well, I thought the iPhone would be great for drawing in a "draw and guess" game. I have a Flash version of it mostly working on MySpace (haven't released it though). You draw something, others guess what it is, others draw, you guess. Guessers and drawers get points, repeat. People seem to like that game, and it would be cool to draw with your finger. I'm now pondering whether iPhone users should only be able to draw, or to guess too. Maybe it's my unfamiliarity with the iPhone soft keyboard, but it seems too painful to seriously use as part of the game. Maybe I can let players choose a game mode. But if everyone is just drawing, will there be enough people left to guess?

Another game which occurred to me after getting this phone is one where you are given a topic, then have to try to photograph that thing in a very limited time using the camera. For example "take a picture of a fork!", then you rush out to your kitchen (hopefully connected to the net with WiFi), take the picture. Judging whether people took pics of forks or not would be done peer-to-peer. You get a picture and have to decide whether it is a fork or not. Maybe with some Slashdot style meta-judgement too to make sure you are giving correct judgements. Well, just a crazy idea.

iPhone dev 12

Sample app is now running on my iPhone! It took a good while to wrap my head around the app signing procedure. I don't think I completely understand it even now. I had to get a certificate (for signing my code?), put my name in some settings, create some kind of "provisions" (some kind of combination of everything else), app id and other stuff too. Important thing is that it runs, and I can now concentrate on coding. Perhaps I will have to really understand this better if I get other members in my team, or maybe deployment won't work without understanding it (although I hope it will).

It's now 5 am and this could be a good point to get some sleep at least after the biden - palin debate. Xcode seems awesome and I'm excited to learn more about using it.

iPhone dev 11

Couldn't even sleep with all this new stuff beckoning me to hack some more. Trying to install iPhone SDK, but realized I don't even know where installed software appears on a mac. Managed to open a terminal and started building the locate db in order to find it. It was also weird that the pipe character moved to alt-7. Not complaining, I was getting bored with my current system, this is refreshing. DB is built. Locate says it's in /Developer but how to start the IDE?

Oh yeah, those strange plus and minus signs in method declarations mean whether the method is an instance method or a class method.

iPhone dev 9 - got the hardware

Today all the hardware arrived. An iPhone, a WLAN box and a Mac Mini. This stuff really was disruptive to getting things done -- I haven't felt this much like a child since... well, since I was a child! Walking to the post office to get the iPhone almost turned into a run because I couldn't wait to play with it. Then when I had everything, stuff worked really well right out of the box. I noticed though that I accidentally got a 2G iPhone instead of a 3G one, but that doesn't really make any difference for development (good bye to my plans of starting to use VoIP w/ Asterisk though). I feel like I've joined some cult now that I have this stuff. I even read the holy texts of Apple -- namely the folklore about pirate flags and Woz's pranks. I have truly joined the dark side.

I have tried to get back to reading the cocoa fundamentals documentation even while I feel a bit giddy and would prefer to just play around with this stuff. I have to remind myself I got these for a purpose -- to develop an app for the iPhone which will then pay for this expensive hardware. So with this in mind I've muddled through the fundamentals documentation, but it's getting awfully abstract. Well of course it is abstract, because I just reached the design patterns part. Fun to read about abstractions in something which in itself already feels abstract to me at this point (Cocoa programming). I think for my motivation's sake I should try to get something going with Xcode while I read forward.

Thursday, October 02, 2008

iPhone dev 8

My iPhone and Mac Mini will arrive today. Might be disruptive to my Cocoa study.

"Sometimes using a protocol can avoid subclassing." Not sure what that means; not sure what "delegates" are either. Code goes in .m files, headers in .h files. Saw how to declare classes. Instead of C-style "#include", Objective-C uses "#import", which is like require_once in PHP. A header file looks something like this:

// -- function and data type declarations --
@interface ClassName : Superclass {
    // -- instance variables --
}
// -- method and property declarations --
@end

The .m file could then look something like this:

#import "ClassName.h"

@implementation ClassName
// -- method implementations --
@end

If I see "IBOutlet" in code later, that is somehow related to "nib files" and the Interface Builder synchronizing with Xcode. Vague at this point. Documentation mentioned that on the iPhone the applicationWillTerminate method gets called when the app shuts down and is the place where state should be saved.
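As a rough sketch of what that might look like later (the controller and outlet names here are hypothetical), an IBOutlet seems to be just an instance variable marked so Interface Builder can connect a nib object to it:

```objc
// MyViewController.h -- hypothetical example
#import <UIKit/UIKit.h>

@interface MyViewController : UIViewController {
    IBOutlet UILabel *statusLabel;    // connected to a label in the nib
}
- (IBAction)buttonPressed:(id)sender; // an action a button in IB can trigger
@end
```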

Getters and setters can be automatically synthesized. "copy" and "retain" tell whether an object variable should store a copy of the object, or just store the pointer and increment the retain count. Something very strange was mentioned about "KVB", "KVC" and "KVO" that I had no idea about.
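A sketch of the synthesized accessors, assuming a made-up Person class:

```objc
// Person.h
@interface Person : NSObject {
    NSString *name;
}
@property (copy) NSString *name;  // "copy": the setter stores a copy of the string
@end

// Person.m
@implementation Person
@synthesize name;  // generates -name and -setName: automatically
@end
```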

Cool thing: in NSLog-style format strings (not plain printf) you can say %@ and then provide an object, and at that point the string returned by that object's "description" method will be inserted. There was a page about threads. It said exceptions should be handled within each thread; they cannot propagate out of a thread. It talked about how error-prone thread programming is, that I should copy data and try to minimize possible conflicts arising from shared data. Events should just be handled by the main thread, and UIKit objects should only be used in the main thread. I imagine I may use threads with socket programming. It also said not all Cocoa classes are thread safe.
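A sketch of how %@ and description fit together (the Point2D class is my own invention):

```objc
@interface Point2D : NSObject {
    int x, y;
}
@end

@implementation Point2D
- (NSString *)description {
    return [NSString stringWithFormat:@"(%d, %d)", x, y];
}
@end

// NSLog(@"point is %@", p); inserts whatever -description returns
```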

iPhone dev 7

There is a Windows-style event loop. On the Mac it lives in NSApplication and on the iPhone it's in UIApplication. AppKit has a function called NSApplicationMain that creates the application object, sets up an autorelease pool, loads the UI from something called a "nib file" (apparently a file that contains files, maybe even directories?) and starts handling events. On the iPhone the equivalent function is called UIApplicationMain.
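For reference, the main.m that the iPhone project template generates seems to be essentially just this (as far as I understand, the nil arguments mean the default UIApplication class is used and the delegate comes from the nib):

```objc
#import <UIKit/UIKit.h>

int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // Creates the UIApplication object, loads the main nib,
    // and runs the event loop; normally never returns.
    int retVal = UIApplicationMain(argc, argv, nil, nil);
    [pool release];
    return retVal;
}
```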

@"test" is shorthand for creating an NSString that contains "test". In some cases the empty string @"" can mean no value / default value. String literals shouldn't be used as dictionary keys? Setter methods are called setSomeVariable:, but getters are just "someVariable". Typical framework usage: create a subclass, override methods to implement your own functionality. Cocoa uses MVC.
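A tiny sketch of the accessor naming convention (Person here is a hypothetical class with a name property):

```objc
Person *p = [[Person alloc] init];
[p setName:@"Ada"];       // setter: setName:
NSString *n = [p name];   // getter: just name, no "get" prefix
```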

Wednesday, October 01, 2008

iPhone dev 6

Init may return a different object than was allocated. For example in the singleton case it may return the already existing object. For this reason you should always use the one returned by init. Objective-C seems to support exceptions (or is it a Cocoa feature? I'm confused about the distinction). Self, super. Strange plus and minus signs near method declarations. Maybe plus signs have something to do with factories? Noted in the explanation about the "respondsToSelector" introspection method that it tells whether an object responds to a certain method -- so "selector" does indeed mean a method? "autorelease" was mentioned many times, but I don't know what that is. Section about class clusters: a public superclass with many private subclasses, and you instantiate the subclasses through factory methods in the superclass. For example a Number superclass which can create Ints, Floats and so on. Skipping the sections about class cluster details and "creating a singleton instance". I'll return to them if the need arises to create my own cluster objects or singletons; it's just too tiring to read about them now.
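This seems to be why the standard init pattern reassigns self to whatever [super init] returns (a sketch; the class is hypothetical):

```objc
- (id)init {
    // [super init] may return a different object, or nil,
    // so reassign self and test it before touching ivars.
    if (self = [super init]) {
        // initialize instance variables here
    }
    return self;
}

// Caller side: always keep the object init returned
MyClass *obj = [[MyClass alloc] init];
```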

iPhone dev 5

SEL is the data type of a selector, but I couldn't really understand what selectors are. Are they methods? Reference counting is called "retain counting". On alloc the retain count is 1. If the retain count reaches zero, the "dealloc" method gets called on the object and after that the memory is released. If you "copy" an object, the retain count (usually?) becomes one for the copy. There are things called "autorelease pools", but their use is discouraged on the iPhone. Somehow everything in the pool gets released at the same time, and somehow objects can be added to such a pool without directly referencing the pool by name (at least the sample code looked like that). AppKit on the Mac has some kind of autorelease pool already created at the beginning. There are some conventions on when to call release on objects. If you created an object, you should also release it. If you got an object from somewhere else, you shouldn't. There was something related to releasing objects created by class factory methods that I didn't understand. alloc -> init -> usable object. In addition to allocating memory, the alloc method also sets a cool explicit "isa" property on the object that points to the object's class. It also zeroes all properties.
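A sketch of those ownership conventions in action (someObject and its name method are made up):

```objc
NSString *s = [[NSString alloc] initWithString:@"hi"]; // retain count 1; I created it, so I own it
[s retain];   // retain count 2
[s release];  // back to 1
[s release];  // reaches 0 -> -dealloc runs, memory is freed

NSString *t = [someObject name]; // I didn't create this, so I don't release it
```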

iPhone dev 4

@property is syntax for declaring properties; together with @synthesize it automatically creates getter and setter methods. Enumeration of sets can be done with the nice "in" syntax, as in some other languages. Calling object methods has a slightly strange syntax: [object method]. It's also possible to give some named arguments, but I'm not sure if the first part before the : is a method name or an argument name. [object keyword1:something keyword2:somethingelse]. Where is the method name? Is it "something"? Not sure. NSObject is the root class of everything, and defines some methods like init (constructor?) and reference counting (retain, release).
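For a concrete case, here's a real NSMutableArray method; if I've understood the docs right, the method name (the selector) is all the keywords joined together, and the values after the colons are the arguments:

```objc
[myArray insertObject:someObject atIndex:3];
// selector (method name): insertObject:atIndex:
// arguments:              someObject and 3
```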

iPhone dev 3

NSObject is the root class for Cocoa classes. Stuff starting with the UI prefix is UIKit related. Objective-C has garbage collection as of version 2.0, but it cannot be used on the iPhone because of performance. Cocoa classes seem cleanly designed. Didn't encounter a regexp class, although I didn't check if it's in NSString's methods. The event mechanism in iPhone UIKit differs from the Mac Application Kit. Looked at Objective-C example code, saw lots of weird square brackets. The "id" datatype can hold any Cocoa object, so it's convenient for enumeration. Dynamic typing, binding, loading. New feature: "categories". By dropping mysterious @ marks in strategic places in your code, you can add methods to existing classes without subclassing. Protocols are like Java interfaces.
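A sketch of a category, assuming I wanted to bolt a (hypothetical) reversed method onto NSString:

```objc
// Add a method to NSString without subclassing it
@interface NSString (MyAdditions)
- (NSString *)reversed;
@end

@implementation NSString (MyAdditions)
- (NSString *)reversed {
    NSMutableString *r = [NSMutableString string]; // autoreleased
    NSInteger i;
    for (i = [self length] - 1; i >= 0; i--)
        [r appendFormat:@"%C", [self characterAtIndex:i]];
    return r;
}
@end

// [@"hello" reversed] -> @"olleh"
```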

Tuesday, September 30, 2008

iPhone dev 2

There seem to be very cool tools for debugging and coding. The "Instruments" application seems impressive, creating data porn from your app as it runs. It isn't mentioned whether it can be used when developing for the iPhone. I learned that the iPhone uses a different compiler; on the Mac gcc is used. Apps have to be compiled specifically for the iPhone simulator, so it really is a simulator and not an emulator. The iPhone needs some special configuration before you can start developing on it.

iPhone dev

I have never used a mac. Once I sat at one in a computer room because other computers were taken, but I couldn't figure out how to turn it on! So with this background let's see how far I can get with iPhone development. I started off by ordering a Mac Mini, buying a jailbroken iPhone from an auction and signing up for the developer program. While waiting for the hardware to arrive I can use the time productively by reading the documentation. Stuff I've learned in the first hour:

"Aqua" is probably what the UI is called on a Mac and "Quartz" is some kind of rendering system for it. You develop software with "Cocoa", which historically comes from NeXTSTEP. The docs talk about "Darwin", which according to Wikipedia is a flavor of UNIX (isn't that Mac OS X? I'm confused). "Carbon" is something I should ignore. Apparently Cocoa also comes with an IDE, which is perhaps called Xcode. When developing for the iPhone my app will take over the whole device -- only one program is running at a time + some background daemons. SQLite and OpenGL are used somewhere by something. On the Mac, Cocoa consists of "foundation" and "application kit", but on the iPhone the app kit is called "UIKit". Foundation does non-gfx things. The language this stuff is developed in is "Objective-C", which is some kind of superset of ANSI C with additional OOP features inspired by Smalltalk. Some lower levels use just plain C, but Cocoa has OOP wrappers for them.

Let's see if I have the energy to continue, or if I decide to auction off my hardware when it comes :)

Thursday, August 07, 2008

Dead-end stocks?

It's funny how some companies seem to be riding on trends that look doomed to me. I've been going alphabetically through Scandinavian stock listings, and so far I've encountered three such companies: AudioDev, Anoto and Cash Guard. AudioDev makes testing equipment for optical media. Anoto makes a system for reading hand-written forms. Cash Guard makes cash handling systems.

In my image of the future all content is accessed through the Internet, so there will be no optical media, and therefore no need for testing equipment for it. All official forms, questionnaires, multiple-choice exams etc. will be filled in electronically, so there won't be much need for OCR / choice-reading machines. Cash will be phased out gradually; it may never disappear entirely, but it will be used less and less.

These three companies don't seem to be riding on very good trends :)

Media circus has arrived

Finnish media companies have gotten excited about "social media". They see it as a trend, and they must report trends. To support trend reporting, they want to raise individual examples, hopefully ones that people can relate to. I happen to be one of the few serious Facebook developers in Finland, so I tend to be that example.

It's been fun. First I was featured in some magazine called "Happi". I'd never heard of it, and even with numerous requests they never sent me the issue that my interview appeared in. But apparently someone read it, because next I was contacted by another magazine called "Image". Now this one I had heard of, it's a very high profile magazine. I got my face filling an entire page, it feels unreal. Seems that stories in media inspire more stories in other media, because next I was contacted by "YleX" radio channel and "Helsingin Sanomat", the largest circulation newspaper in Finland. Just wow.

Friday, April 11, 2008

Silly Facebook apps are keeping my stomach full and a roof over my head

This is a long post, but lots of exciting things have happened since the last one!

The Facebook melody composing app I was making in August 2007 didn't take off. Perhaps it just wasn't a very good app for making melodies, or perhaps people just aren't creative enough for it to become viral. I abandoned it and carried on making other ones.

Boy, was that a good decision! It's been a jumble, so I can't even remember the proper chronological order of things, but I think the next app I made was "Your Japanese Name". I thought it would have a very limited appeal, but actually half a million people have installed it now! Wow. Also, it's now making enough money for me to pay my rent.

This got me thinking that perhaps other easy-to-use things that let people express their identity on their profile would do well also. Therefore I started adding to my list all kinds of other apps that fit that description. Some people are looking down on these apps and calling them "badge apps" with no real value, but I believe that ultimately it is the users who decide what is entertaining for them.

I made several. Lots failed. I think failure is good too, it's great to know what doesn't work and try to ponder why. I made one which comes up with a description of the past life of the user. Like it might say that you were a cave explorer in your past life, and that's why you are so adventurous now. Perhaps that was a bit too random, showing that off in your profile doesn't really tell anything about your identity. I'm happy that it failed, because it raised my belief in humanity a bit, that people won't accept just any crappy app :)

Just to make sure though, I had another app made which is similar, but it shows what you will be reincarnated as in the future. That failed even harder. Strangely, someone else later made a past life app similar to mine, except theirs was a huge, huge success. I believe that was because of its forced invite system though, and not because the app itself had some merit above mine. Well, the author got away with it and made a lot of money, so maybe I should have put such a system in place too.

There are about a dozen failed apps which I made. I'll describe more of those in other posts. Let's talk about a success for a change. It's called "Name Analyzer", and like "Your Japanese Name" it displays your name in your profile, but in plain English, with some adjectives attached to explain what each letter in your name means. I can't claim it's an original idea, but it hadn't been done for Facebook yet. Basically it has the merits of "Your Japanese Name", but doesn't limit the userbase to only those with an interest in Japan.

It took off like a rocket and now has nearly 7 million installs, with a hundred thousand people using it every day. I wrote last year about my scalability and bandwidth worries, but now in retrospect that hasn't been an issue. I'm not paying significantly more for hosting now, and just one server has been enough to handle the load, and even that server has been mostly idle, even though the server logs are really flying.

Oh, I lost a lot of impressions though because the logs are REALLY flying, and I hadn't realized how large they would become! I had 8GB of space left, so I thought I would be OK. However, with 100k people each doing several page loads and causing longish lines to be appended to ever growing log files, I actually ran out of space. Many times. Every time the velocity of the expansion of the server logs took me by surprise! Running out of space is a bit nasty though, because it corrupts MySQL tables. Luckily the repair worked!

After seeing that people have an interest in it, I have added more features to make the app stickier, with pretty good success. Now there is functionality to decide analyses for your friends, change colors and fonts and backgrounds, make your own themes that others can use etc., all ever so slightly increasing the frequency of people visiting the app and the length of time they spend in it. Not a big difference, but with user numbers this large even a small percentage difference can be significant.

With monetization, I believe I have signed up with the best advertising network, which is Social Media. I do feel that I haven't really tried out all the other alternatives, but they pay so well that it's difficult to imagine the others could be better. From discussions on the developer forum, there seems to be a consensus that Social Media pays the best. The only other options I have tried are Google AdSense and selling merchandise through Zazzle, but both of them paid an order of magnitude less than Social Media. Perhaps I should properly try out Cubics, VideoEgg and the others though, just to be sure.

Even though the merchandise (mugs with your name on them) failed, I'm not sure the idea itself was flawed. Maybe the product, or having a link to an external site, was the problem. Maybe if there were something that could be easily purchased while on Facebook, I could monetize better. The only ideas I've had so far are mobile phone background images or premium SMS subscriptions to get your name analyzed on your phone. That's something I might explore more.

So, how has this increased ad inventory affected my life? Well, I already told you that "Your Japanese Name" has allowed me to pay my rent, so you can guess that "Name Analyzer" has been even more significant in that respect. I won't make any big purchases though; the only major thing is that I'll probably travel a bit more, still on a shoestring budget. I'd like to think I'm pretty responsible when it comes to my spending. I don't have a car, and I live in a normal-sized shared apartment. I eat pea soup out of a can for dinner. I even have a spreadsheet with an inflation-adjusted plan that runs until I am 74, the age at which the average Finnish male tends to die. I don't have enough to live until then, so I'll keep saving. Couldn't resist getting a projector and a Nintendo DS though, ha ha :p

I've gotten some very VERY interesting requests to appear at job interviews, but I don't really feel comfortable letting others decide what I should be making, now that I'm having so much fun thinking up new apps :) And I do have a lot of ideas! That list I started last year is getting really long. Lots of them are obviously stupid ideas when I look at them now, but for some it's hard to say! I hope I'll have success making some other type of app, I wouldn't want to be a one trick pony!