Wednesday, December 31, 2008

Defend The < Insert Anything >!

Happy New Year! It's been a while since I last posted, but I've started a few other posts and will hopefully be getting them out of the tube sometime soon.

When I was younger, my dad, who was in the military, told me that whenever the Army is going to attack a fortified position, it always tries to have a 3-to-1 advantage over the defenders, to minimize its casualties. This is because it is much easier to hold ground than it is to gain it, especially if you know the lay of the battlefield better (or even fortified it yourself). You can concentrate your resources at certain chokepoints and gain significant advantages over your adversaries. Unless taken by surprise, a defensive position can easily hold back wave after wave of attackers.

I think that this is one of the primary reasons that the "Tower Defense" style of game is becoming so popular, especially in the casual/mobile gaming scene. Usually labeled "Strategy", "Action Strategy", or something similar, these games have a very basic premise: you've got a thing, the bad guys want to blow up your thing, don't let them blow up your thing. It's actually quite ingenious the way game designers have turned us on our heads. Back in the days of yore, the player was always attempting to get past the defenses, rescue someone or something, or blow something to smithereens. Now, players are tasked with keeping out all those pesky interlopers, though the basic characterizations are the same.

What do I mean by that, you ask? Well, when one character was able to singlehandedly slaughter the entire Russian army with nothing but a toothbrush, some people called that "unrealistic." However, when that single person is tasked with repelling wave after wave of robotic archer demons, it's somehow much more believable.

Like in my post about zombies, people who play video games like to suspend disbelief far enough to accept the game world, the (usually ridiculous) story, and the mystical powers that they are somehow imbued with. However, granting a modicum of realism, such as having them set up defenses, and actively engaging swarms of enemies, allows them to feel empowered, but not invincible. When the player feels like they can do anything with little or no consequences, the game breaks down somewhat, as the player tears through the levels, unconcerned about what will become of their avatar.

That's all I have to say on the subject. Don't worry, I promise that my next post will have some code in it. ;-P

Thursday, December 18, 2008

The Analysis of the Dead

In this past week's Zero Punctuation (I should really stop getting my blog ideas from it, I'll work on that), Yahtzee remarked that besides Pirates, Monkeys, and Ninjas, Zombies were perhaps the most popular thing among "nerds". Specifically gaming nerds. Yes, the "legitimacy" of zombies in popular culture is growing every year. What used to just be a cheap horror movie effect has actually been used to create artsy, compelling narratives and games. Need proof? Check out "Fido."

Sure, zombies are getting big. Pirates got big, Ninjas got big, monkeys...I think their time has passed as well. Zombies seem to be sticking around for a bit longer though. It's probably partly due to the whole "decomposing" thing. All joking aside, what makes nerds, a generally erudite and intolerant bunch, so happy to have an antagonist that is slow, dim-witted, operates in hordes, and attempts to eat brains?
Hey, wait a second, I think there might be something there.
Zombies are the exact opposite of what a stereotypical nerd wants to be. They're barely above fungi when it comes to IQ, and have an unstoppable need to attack those that aren't one of themselves. Is it just me, or does this sound exactly like the kids that made fun of you in high school because you'd rather read than watch "Grey's Anatomy"? In fact, when you think about it, what do zombies usually want to eat? Brains? What's the only way to kill them? Kill their brains? If someone was coming up with these metaphors, they weren't digging very deep (probably about 6 feet). 
What naturally follows the intelligence disparity is the common criticism nerds have for "normal" people: that they act like sheep, meandering pointlessly around in herds. Or hordes. Used to social exile, imposed either by self or by others, nerds generally scoff at being part of a large group. They don't want to be following the trends that everyone else follows, because everyone else is stupid, and probably wrong. If you hadn't noticed, this is where the intolerance comes out. Although they commonly accept people who are "different" into their ranks, many nerds find non-nerds to be unbearable. Therefore, any nerd anathema is most definitely going to exist in some sort of large group. This serves another purpose as well. By making their antagonists a seething mass, each individual is de-identified, and it doesn't really matter whose head you're having to lop off to get to the rescue zone. It could be your old gym teacher, that cute girl who said you smelled funny when you asked her out, anyone.
All that being said, I have to say that I'm still a zombie fan. Although I like to think of myself as a very tolerant person, I'd definitely agree that part of my liking of the zombie metaphor is a bit of distrust and dislike of the general public. Not exactly as a mass of individuals, but the whole societies themselves. To me, the zombie hordes represent the selfish, immutable cultures that are sending all of us to hell in a handbasket, finally imploding upon themselves, making way for a new world, while still representing major obstacles to that world as they slowly rot away.
Or maybe I'm just hungry for BRAIIINS.

Tuesday, November 25, 2008

Thoughts on Advertising

This is actually a topic that I've been thinking about for a while now, going back to a conversation I had with Corvus Elrod at his "moving to the West Coast" party. We were discussing advertising in games, and generated a lot of interesting points. As a storyteller, it is anathema for him to see anything that pulls a player out of the game experience and gets them thinking about products in the real world. I agree with this, to a point. When one thinks of the rise of casual games and the free product business model, it has to be conceded that advertising is one of the main revenue streams for (and therefore drivers of) the games industry. Armor Games was a company that I thought had nothing but cheesy, vapid fare, until I was given a disk by the people at the Indie Games Showcase. Some of their offerings, particularly in the puzzle category (which I usually avoid), blew my mind with their innovations. This company, which is producing quality (though Flash) games, is completely supported by the ads on their site (as far as I can tell).

However, I just bought a new cell phone (a Google G1, in fact, and I love it), and got my first telemarketing call on it a few days afterward. Rather than encouraging me to buy whatever product they were hawking, all that call did was remind me to add my number to the "Do Not Call" list. So, the real question here is: why don't I mind unsolicited advertising next to games I play online, or even in my email, while there's a national backlash against telemarketing and spamming?

One word: Interference.

If advertising interferes with what you're trying to do, then it's a problem. Calling my phone, or sending me email, wastes time that I would otherwise spend doing other things. Interrupt me, and you're not hooking me as a customer. If I'm already on a website (or watching TV, or at some sort of sporting event), I'm an audience member already. If I happen to look at an ad while I'm waiting for the next play, that's not interfering with what I'm trying to do, and it's not breeding any resentment in me. In fact, if it's entertaining enough, I might even be glad it's there. Just look at how many funny/amazing commercials have been posted, by random people, up on YouTube. I readily admit that I would not buy Brawndo if they hadn't gone with an over-the-top, hilarious ad campaign.

Personally, I have no problem with advertising. In its most basic form, it's just trying to get you to buy something. That's fine; if we didn't buy things, we wouldn't really have an economy. Buying things drives innovation, and makes our lives easier. I don't even have a problem with targeted advertising. Some people say it's an invasion of privacy, and it is slightly, but personally, I'd rather not be pitched Depends while I'm checking my email. I see nothing wrong with companies collecting information about buying habits, customer satisfaction and whatnot. It's all supposed to make things more efficient and convenient.

However, if advertising gets too pervasive, too omnipresent, then we have a problem.

Wednesday, November 19, 2008

Dealing with Null or "All" Parameters in MS SQL Reporting Services

I finally had something happen to me at work that I can add to this blog. Oh happy day, I feel as if I might faint.

Either way, I've recently become the SSRS "rock star" at my office, where even my bosses are stopping by and asking me questions about how to get their queries/formatting to work. It's not as glamorous as I thought it would be, and the groupies haunt my nightmares, but it's fun...ish.

Anyways, the person who ran Reporting Services before me had an interesting way of using parameters to filter results in the SQL query. I know there's a way to filter just using the wizard, but I'm of the opinion that it is probably faster (or at least more efficient) to do the filtering in the query itself. For our filters, if the user doesn't want to use a given filter, the first option says "All", but actually has a value of null. The way my predecessor handled this was to have a separate query for every combination of null/not-null parameters. So, if there was one parameter that could be null, there would be two queries, with IF statements to select between them. If there were two nullable parameters, that number rose to four. Three would have meant eight separate queries, et cetera.

This wasn't really acceptable to me (mostly because I didn't want to have to add four more queries), so I looked to see if it was possible to use IF statements right in the middle of the WHERE clause. No dice. So, what I did instead was use the following style of WHERE filter:

WHERE/AND (@param is null OR object.RelevantValue = @param)

This way, if the parameter is null, the expression still evaluates to true, and the filter only matters when the parameter is actually specified. Now, the report query is 1/4 of its previous size, and 1/8th the size it would've been if I had done it the old way. Rock star? Maybe, maybe not, I don't care. This is just what I do.
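To see the pattern in action outside of SSRS, here's a minimal sketch in Python with SQLite standing in for SQL Server (the orders table, its columns, and the data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "East", "open"), (2, "West", "open"), (3, "East", "closed")])

# One query covers every combination of null ("All") and specified parameters:
# each (:param IS NULL OR column = :param) clause collapses to true when the
# parameter is null, so a filter only bites when a value was actually chosen.
QUERY = """
    SELECT id FROM orders
    WHERE (:region IS NULL OR region = :region)
      AND (:status IS NULL OR status = :status)
    ORDER BY id
"""

def fetch(region=None, status=None):
    return [row[0] for row in conn.execute(QUERY, {"region": region, "status": status})]

print(fetch())                               # "All"/"All": every row
print(fetch(region="East"))                  # one filter specified
print(fetch(region="East", status="open"))   # both filters specified
```

One query replaces all 2^n variants, which is exactly why the report shrank to a quarter of its size.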

Monday, November 17, 2008

Tables for Layout?

I love this

It was a response to a page that claimed that the average time that a web developer took before abandoning CSS for web layout was 47 minutes. Anyone who has worked with me (or watched me work on a web project) will vouch that I have toiled untold hours of my life away, simply trying to get a CSS layout to work. It warms my heart to see that other people are willing to be as fanatical about standards and best practices as I am.

Also, I'll be at the Philadelphia Game Expo this weekend. If you're there, say hi!

Sorry about the lack of posts, I've been working quite diligently in what little free time I've had, but hopefully some more posts will be forthcoming soon.

Wednesday, October 22, 2008

Lock's Quest Recap

This is sort of a game review, but also a discussion about interface design and the nature of fun in games. I recently purchased Lock's Quest, which was named best strategy game at E3, which for a DS title is pretty impressive. Its developers also put out Drawn to Life, a game that I wanted but never got around to purchasing. Basically, you're a special boy in a war...blah blah blah, the story isn't really that important, but the game itself is really involving.

You have to build walls and turrets to fend off increasingly large and tough armies of clockwork robots. That in and of itself sold me, but the DS's touchscreen is really well implemented in both combat and repairing damaged structures. Depending on what you're trying to do, there's an associated task (pulling a lever, spinning a gear, etc.) that makes the action you're attempting go faster or be more effective. While this sounds kind of hackneyed and repetitive, the tasks are mixed up nicely, and you don't just sit around for 10 minutes doing the same thing over and over again.

I'm also really impressed with how they set up the flow of the game. After a few rounds with your defenses, you have an impenetrable death-fortress, and you're able to handle even the most vicious onslaughts. Then, the game tells you that you have to go grab an objective all the way on the other side of the map. It's frustrating, yes, but it breaks the whole build-repair-fight cycle, which is nice. It's also worth noting that there are few things more awesome than watching wave after wave of bad guys crash on the rocky cliffs of your defenses.

One thing is infuriating, though. The screen doesn't zoom out, and there's a fixed speed to the camera, so panning from one wall to another can sometimes take precious seconds. This in and of itself is not too bad, since (if you're like me) you can set up relatively small, close defenses. However, when you pan to a location and then tap (which tells your character to move there), the AI that keeps the main character from running aimlessly into a tree is a little bit lacking. So far, I have lost a solid number of turrets, and even a level or two, just because there was a rock or tree or bit of something in the way, and the flax-haired hero did nothing except struggle against the properties of matter until I noticed and gave him an easier location to get to (after a while, you get used to tapping out smaller, shorter paths).

With that behind me, I do have a few more good things to say about this game. Despite the "ehh" story, the game has a good length to it. I thoroughly enjoy the game mechanics that are in place, and was worried that with all the cool stuff in it, the levels would be cut tragically short, leaving me wanting more, much like when you visit home and only get a single strip of bacon, because that's "healthier." Don't worry, this game has platters of bacon. I've been playing it for about two solid weeks now, mostly during my commute and at home, and I think I'm only 2/3 of the way through it.

Also, the difficulty is really well done. While I didn't like that there wasn't a standard dial-a-difficulty, which would be pretty easy to do with this game, I was able to clear most of the levels on the first try, though a good number of those were nail-biters, and a couple of them required a do-over. The increases in challenge over time are well-paced. I never felt too bored, and when I was right on the edge, something new popped up. Also, there's a fun little siege mini-game which would be right at home as a flash game.

Monday, October 13, 2008

Using A LIKE Operator With A String Parameter In SSRS


Just something quick that I figured out today that I felt warranted sharing. I'm doing some work with SQL Server Reporting Services (SSRS) and the Business Intelligence Development Studio (BIDS), creating reports for work. Today, I was asked to create one that used a LIKE statement in a WHERE clause, where the argument passed to LIKE is a string parameter. The challenge is that concatenating "%"s to the parameter didn't work in the query, nor did the same process in the Dataset -> Filter By tab. The only possible answer I could find was on Experts Exchange, which requires a subscription, so I had to figure it out myself. Here's what I did:

  1. Create the String parameter you actually want to search with. Make sure that it can be blank, but don't let it be null. For the sake of the example, I'm calling it Filter.
  2. Create another, hidden parameter that is also a string. Make sure it is hidden! Then, set the default value to be non-queried ="%" & Parameters!Filter.Value & "%". I gave this parameter the name FilterFormatted. Now, hit okay and close out your Report Parameters.
  3. Now, in the query, all you need to do is write "WHERE columntofilter LIKE @FilterFormatted". If you're testing the report, make sure you put the "%" before and after your search string, but going to the "Preview" tab will let you test it the way it'll be deployed (though you should probably know that already).
That's it! Interesting problem, simple fix, solution posted.
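The trick can be sketched outside of SSRS, too. Here's a rough Python/SQLite stand-in for the steps above (the people table and its contents are invented for the demo; filter_formatted plays the role of the hidden FilterFormatted parameter):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT)")  # hypothetical table for the demo
conn.executemany("INSERT INTO people VALUES (?)",
                 [("Alice",), ("Bob",), ("Alastair",)])

def search(filter_text):
    # Step 2's hidden parameter: wrap the user's raw filter in wildcards
    # before it reaches the query (the ="%" & Parameters!Filter.Value & "%" trick).
    filter_formatted = "%" + filter_text + "%"
    # Step 3's query: the LIKE argument is just the pre-wrapped parameter.
    rows = conn.execute("SELECT name FROM people WHERE name LIKE ?",
                        (filter_formatted,))
    return [r[0] for r in rows]

print(search("al"))  # matches "Alice" and "Alastair" (SQLite LIKE is case-insensitive for ASCII)
print(search(""))    # a blank filter becomes "%%", which matches everything
```

The blank-filter case is why step 1 allows the parameter to be blank but not null: an empty string still produces a valid "%%" pattern.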

Wednesday, September 24, 2008

OpenSocial Version 0.8 App Development From Scratch

With the newest version of OpenSocial out, I felt compelled to revisit it, despite the fact that my new job couldn't have me farther from it. A labor of love, some would say, to stay up-to-date on a constantly evolving platform, like trying to hold onto pudding with your bare hands. Anyways, where was I?

I'm going to develop an OpenSocial 0.8 app from scratch. Let's get started. First, I need to set up the initial scaffold.

<?xml version="1.0" encoding="UTF-8" ?>
<Module>
  <ModulePrefs title="Pandaface">
    <Require feature="opensocial-0.8"/>
  </ModulePrefs>
  <Content type="html"><![CDATA[
    <div id="gadgetdiv"></div>
    <script>gadgets.util.registerOnLoadHandler(gogoGadget);</script>
  ]]></Content>
</Module>

This app is going to be named "Pandaface," as you can see from the <ModulePrefs title="Pandaface"> line. It's also important to point out the gadgets.util.registerOnLoadHandler line. This sets the JavaScript function that will execute when the gadget loads. Usually, one would set this up in the body tag of an HTML page, but declaring it in this manner makes sure that the callback function (in this case gogoGadget; I prefer that over "main") won't execute until all the gadget code and its dependencies are loaded.

Step 2 is setting up the first data request. In my first request, I like to get the OWNER and VIEWER objects. These will allow me to know if the viewer is looking at their own version of the App or someone else's. To do this, gogoGadget is changed to look like this.

function gogoGadget() {
  var request = opensocial.newDataRequest();
  request.add(request.newFetchPersonRequest("OWNER"), "get_owner");
  request.add(request.newFetchPersonRequest("VIEWER"), "get_viewer");
  request.send(response);
}
This creates a request object, adds two FetchPersonRequests, and sends the request out, specifying that the function 'response' will deal with the data response. It is important to note that the callback will not execute until it has all the data. Let's take a look at the response function.

function response(responseData) {
  owner = responseData.get('get_owner').getData();
  viewer = responseData.get('get_viewer').getData();
}

Not terribly interesting at the moment, but it is important to note a few things. In gogoGadget, the last parameter given to each add() call is a name for that specific response. MAKE SURE THESE MATCH UP. In my other OpenSocial adventures, I got that mixed up at one point, and couldn't for the life of me figure out what was going wrong.

Now, let's pull down some persistent data. I'm just going to have a single variable, face_data, and pull it from the orkut sandbox server. To do this, I have to declare another datarequest, and add a newFetchPersonAppDataRequest. Back in .7, you could just pass the user's id, and the name of the app data you wanted to pull. Now, however, you have to create an IdSpec object for it. This is new, and fairly unintuitive (especially to people who've been working with it as long as I have).  Here's the full data request.

var ownerspec = opensocial.newIdSpec({ "userId" : "OWNER", "groupId" : "SELF" });
var datarequest = opensocial.newDataRequest();
datarequest.add(datarequest.newFetchPersonAppDataRequest(ownerspec, "face_data"), "face_data");
datarequest.add(datarequest.newFetchPersonAppDataRequest(ownerspec, "face_url"), "face_url");
datarequest.send(loadUI);

Make sure to pay attention to what you name the variable, like with the person request. You also have to make sure your IdSpec is constructed correctly. At this point, you can only have "OWNER" or "VIEWER" for "userId", and "SELF" or "FRIENDS" for "groupId." There's also an important thing to note about the 3rd line: the first "face_data" is actually the name of the variable the request is supposed to retrieve. Along with named variables (which come back as null if they haven't been set yet), you can also just put a *, for all variables associated with your app. On the response side, there is another main difference from the person requests.

function loadUI(responseData) {
  var facedata = responseData.get('face_data').getData()[owner.getId()];
  var faceurl = responseData.get('face_url').getData()[owner.getId()];
  var to_output = owner.getDisplayName() + " is a ";
  if (facedata == null)
    to_output += "lazy";
  else
    to_output += facedata.face_data;
  to_output += " panda.";
  if (faceurl == null)
    faceurl = "";
  else
    faceurl = faceurl.face_url;
}
It is important to note how to retrieve the data from the response object. Not only do you have to call get() with the field name, and getData(), but you must also grab the data from the index of the owner's id, and then call the name of the variable off THAT. 

What this does is set default values for facedata and faceurl, and then combines them to make the entire gadget rendering. Now, if you look at my profile, either as me, or as someone else, you see the following:

Not terribly exciting, but you can see what it's trying to do. Now, to add a bit of user functionality. For this app, the only real interactivity is going to be when the owner changes their emotion or the panda's face. To do that, we first have to see if the person viewing the app is the owner. That's pretty intuitive:

if (viewer.isOwner()) {
  // ... owner-only admin panel goes here ...
}

See? Now, what I do is build an admin panel/form that has a text box for the user to input their emotion, and then radio buttons for the panda's face.

// Note: the markup below is illustrative; the ids and values are placeholders.
owner_output = '<br/>New Panda Emotion? ';
owner_output += '<input type="text" id="emotionbox"/>';
owner_output += '<br/>New Panda Face? ';
owner_output += '<input type="radio" name="faces" value="0"/> <input type="radio" name="faces" value="1"/>';
owner_output += '<input type="button" value="Update!" onclick="changeFace()"/>';
document.getElementById("gadgetdiv").innerHTML += owner_output;

For the sake of brevity, I'm not going to detail both changing functions, but just the important bits of changeFace. You get the value of the new emotion, and then create an updateAppDataRequest and submit it. You still need to specify a callback though. This lets you do error checking to make sure that the request made it through alright.

var updaterequest = opensocial.newDataRequest();
updaterequest.add(updaterequest.newUpdatePersonAppDataRequest("VIEWER", "face_data", newemotion), "update1");
updaterequest.send(finishUpdate);

For finishUpdate, I just have it mirror the initial data pull, and loop back to loadUI, thus completing a beautiful life cycle.

function finishUpdate(responseData) {
  var ownerspec = opensocial.newIdSpec({ "userId" : "OWNER", "groupId" : "SELF" });
  var datarequest = opensocial.newDataRequest();
  datarequest.add(datarequest.newFetchPersonAppDataRequest(ownerspec, "face_data"), "face_data");
  datarequest.add(datarequest.newFetchPersonAppDataRequest(ownerspec, "face_url"), "face_url");
  datarequest.send(loadUI);
}

Obviously this example could be fleshed out a LOT more. By adding posting to the activity stream (which I may still do) and a much prettier interface, this app could be a lot more solid. All in all, it's not bad for a few hours of work. The full gadget xml is at:

Monday, September 8, 2008

Browser Inconsistencies: Part 2


Today was my first day at my new job. Exciting, but nothing notable yet. Hence, this is a relatively shorter entry.

Anyways, I was playing around with HTML and CSS again, putting together a chess/checkers-style board with HTML and CSS, and found another browser inconsistency.

I have a 4x4 grid of alternating white and black tiles. The tiles are arranged in rows, and there are tokens that are placed absolutely on top of them. The rows of tiles and tokens are all in a master div. The rows are each divs that have a clear: both property to make them cascade properly. At first, I didn't define the horizontal position of the tokens. In firefox/opera/chrome, you see the following:

The absolutely positioned tokens are removed from where they "should" be, and are placed in the upper-left corner of their containing div (the white token has been defined to be that low). However, in IE8, this happens:

Apparently, if a horizontal position is not specified, the absolutely positioned elements stay where they would, even though they are removed from the box model. You can tell they have still been removed, because if they hadn't, then the border would be surrounding them as well. Setting the left: 0px; property makes IE8 render the same as all the others, but it is still interesting.

Monday, August 18, 2008

Browser Inconsistencies: Part 1

I did a freelance gig recently that required me to do some HTML/CSS. Like all web tasks, one of the primary concerns was browser compatibility. After several hours of cursing at IE and Firefox, I found a good PHP script that reliably detected the browser, so I used that, but doing that always feels like cheating, so I'm going to be spending some time in the coming months playing around with CSS and HTML, looking at where the breakdowns occur. Here's my first post on the subject.

Surprisingly, when just messing around, it took a little while before I found my first break. I put two divs, one absolutely positioned, one relatively positioned, inside a larger div. The larger div was placed relatively at the top of the screen, and given the style "top: 20%;" What happened then? I'll show you.

Surprisingly, Firefox was the one that did not register this style correctly. I'm pretty sure it is a Firefox issue, and not something laid down by the W3C, because when I changed the style to read "top: 20px;", both browsers behaved the same. Interesting.

Thursday, August 7, 2008

Little Bit of Good News

Apparently, my blog is the top hit for the search "add hidden fields programmatically" on Google.


Welcome everyone who's looking for how to add hidden fields programmatically. Please leave comments as to whether or not you found my entry useful!

Monday, July 28, 2008

Variable Scope Issues

Again, here's a new link to my latest project.
I think it's actually pretty fun, so I hope you try it out.

The only major issue left to tackle is the high score board. When the game ends, you can submit your score to the online scoreboard (I'm currently in first place, go figure.) It was a chance for me to play around with flash's internet communication abilities. It works pretty decently as well, if you ask me. The problem is that, for some reason, the scoreboard only shows up the first time you play it (per page refresh). What that means is that if you press the "Play again" button, play the game, end the game, and submit your score, you won't see the scoreboard. I'm completely scoobied as to why that is. The code to update the scoreboard is as such:

htmltext =;
scoreBox.htmlText = htmltext;

If you're not familiar with flash, trace is a command that outputs things to a special "output" window. Therefore, the variable should be shown on my screen. The problem is that despite the fact that I do get the correct text outputted to me by the trace call, the scoreBox.htmlText doesn't reflect what it was given on anything other than the first time through the game. I had a similar issue (in fact it was the only real major "bug" in the first iteration) with the images from the last scoop being present on replays as well, and that was fixed by making sure that everything was reset properly at the appropriate time.

My guess is that this is a variable scoping issue, partially because that was the issue the first time, and partially because Flash's scoping is about as confusing as watching Memento when you're tripping acid. As far as I can tell, variables instantiated within a frame are scoped globally, unless they're within a loop or function. However, commands, or anything else for that matter, only apply at that specific frame. This kinda makes sense. You might want to use a variable you defined somewhere else in the program, and you don't want all of your code executing at once. The issue here is, when are objects (like the scoreBox) that are defined in the scene, not in the code, created? Are they global? Can I change their values before they're visible? I'm still trying to figure it out, so if anyone has any suggestions, feel free to leave them in the comments.

As far as I can tell, it's only used for debugging, as my experiences with the actual Flash debugger were about as pleasant as shaving with a rusty butter knife.

Wednesday, July 9, 2008

Baby Steps

I think this is probably my first post ever that contains any actual coding in it, and frankly, I'm a little scared. This is such foreign territory for me, I don't know where to start.

So I'll start with a link.

Yes, it is in flash. I feel dirty with this being the first "complete" program I post, but we all have our flaws. I actually haven't paid for Flash CS3, but I'm using the 30-day free trial, and working fast. In terms of development, there are some things I like and some things I really dislike, but I'll get into that later.

If you can't tell by the URL, or the name, it's just a cheesy, quick Space Invaders clone. I programmed it in ActionScript 3, despite never having used it before, in about 3 hours. My main problem so far has been handling contact between the lasers and their targets. The lasers are (obviously) placed dynamically, and in their update method, I tried using the following line to "kill" invaders when they touch.


Which, on top of being horribly hardcoded, throws the following error.

Error #2025: The supplied DisplayObject must be a child of the caller.

I tried everything I could think of, and eventually just decided that it was just a "Flash Thing," until I realized that the lasers were being added like so:


The problem here was that when I was looking at how to dynamically add new objects, the tutorial didn't have its code inside a function; it was just on the stage. Therefore, the keyword this referred to the stage when the lasers were added, setting up for a similar removal with this.removeChild(). Apparently, when inside a function, this is not the stage. There was another issue, however.

The Invader still would not be removed via stage.removeChild(); it continued to throw the "Child of the Caller" error. I intend to dig deeper into this, but not too much, since the issue is relatively moot: the Invaders were added to the scene statically, which I found is unlikely to happen in any sort of "normal" game. Things that are dynamically added and removed (just as God intended) come into and out of the scene without an issue. Not terribly enlightening, but at least it wasn't about the iPhone.

Monday, June 30, 2008

Get Ready To Grind Yourself Retarded

One of my current favorite web video series at the moment is The Escapist's Zero Punctuation. Ben Croshaw definitely has his own brand of humor, but I feel the need to comment on one of his common criticisms: grinding for inordinate amounts of time. First, a definition.

Grind (v): To repeatedly perform an identical action or sequence of actions in a game solely for the purposes of advancement otherwise unreachable.

Being forced by a game to perform the same action or actions over and over again is actually what gaming is usually about. Take any game you can think of. FPS, RTS, casual games especially: they're just an endless repetition of doing the same thing over and over and over, until you either fail to perform that action correctly a certain number of times, or you get bored or tired and go do something else. This isn't necessarily bad. Can you imagine a game where the controls changed every two minutes, along with the perspective, goals, and game mechanics? It would be horrible and confusing. It could be intriguing to some (see: masochists), but for most people, it would be something set aside almost instantly.

So, assuming that almost all games are a set of repeated actions, what makes some mechanics fun, while others are more boring and tedious than de-lousing your grandpa? Why is it that battling 500 rats or goblins to level up is mind-numbingly sloggy, whereas blasting your way through the Combine, Covenant, or NOD is fun?

Pardon me for using programmer-speak if you're not into that, but the answer is shifting arguments. If you're the Ruby or Python interpreter, you don't mind seeing the same function called with the same values 500 times. You're a computer program, you don't even have feelings. In fact, why are you reading my blog? Get back to work, damnit.


As I was saying, people are not computers. We like things to change frequently, but in small and occasionally medium-sized ways. For example, most JRPGs I've played are basically big grinders wrapped in a tissue-thin container of plot. You encounter some bad guys, you have to hit "Attack" or "Magic," and direct death towards your enemies. Occasionally, there's a cool cut scene thrown in. This could be fun for a while, but eventually you know the key controls so well that you could be a blind vegetable (personally, I like carrots, but some people are partial to broccoli) and achieve the same results. The original Pokemon games (Red and Blue, for you youngsters) realized this, and also realized one of the potential ways to fix it. If you have so many different enemies that you only encounter the same one every once in a while, it isn't that monotonous. Eventually, that will get boring, but before it does, they let you advance onto a new area with a new set of creatures to brutalize and enslave.

Casual games are the kings of grinding. In fact, it's pretty much their modus operandi. You take one concept, one mechanic, and apply it ad nauseam until the player is either comatose or victorious. Why are they still ridiculously fun (and addictive)? Variation. Bubble Bobble and Sudoku wouldn't be fun if they had the exact same patterns every single time. Even before computer games were invented, we had crossword puzzles built on the exact same concept. Do a thing, change the parameters, do the thing again, etc.

So how does fun repetition become painful grinding?

  1. When you want to advance in the storyline, but can't because you have to grind some more. This is almost more of a pacing issue than it is a game mechanic issue. Of course, you don't want to just shuffle the player from cutscene to cutscene, with barely any gameplay (unless you're Devil May Cry 4, of course) but at the same time, you don't want the player to be so bored that they turn off the game before they get to the next plot point.
  2. When the challenges don't change at all. This should really be a no-brainer. It's why Postal workers used to go crazy. All work and no play makes Jack a dull boy. If you don't want your players to be bored, then switch things up a bit. Vary the enemies somehow. Difficulty, color, tactics, numbers, whatever, just try to throw a wrench in the player's grind machine every once in a while. They'll be surprised, possibly angry at first, but hey, angry beats bored.
  3. When the types of challenges don't change. This is subtly different from #2, but important. It's actually the reason why the concept of "boss battles" has done so well throughout the course of video game history. This is the "medium-sized" shift I was talking about earlier. The game mechanics can still be the same here, but there has to be a significant shift in what the player is doing with them. Instead of slicing through hordes of little enemies, they're fighting one very difficult one. Instead of building up a base and defenses, building an army, and attacking, the player is given an army and shoved out the door to complete an objective. Minor changes like the ones mentioned in #2 will hold off the feelings of grinding, but nothing abates them faster than having their usual plans rendered temporarily useless.
Whenever I design a game, or even a game mechanic, I always have to ask myself "would this still be fun after two hours?" I think that if more game designers did the same thing, then we wouldn't have to hear about how someone had to grind all night to get the items necessary to get up to level Y so they could make beef stew.

Wednesday, June 25, 2008

Why I Hate Flash

I could apologize for this being the first post in two months, but I won't. This is my third rewrite, mostly due to my incompetence, but frankly it's beginning to wear thin on me.

I'm currently taking a game design course at my university along with two of my friends. Woo and yay. A class that actually may have some bearing on what I might actually do with my life. Don't get me wrong, but I don't see any time in my future when analyzing the character in John Updike's A&P will be required of me in my line of work. Anyways, I love gaming, I love game design, I love talking about game design, and I like several people in the class. However, after our first meeting, I feel a little down. Why? This sentence:

"2D Games will be done in Flash."

Well, there's the chandelier made of rabbit droppings. I despise Flash, and apparently most people cannot understand why, so I'm going to lay it out here, as much for me as it is for the half-dozen other people who will read it.

First and foremost, I don't like Flash because I have to pay to program in it. This, if you didn't know, is a bit of an anomaly in the programming world. With the exception of Flash, there's hardly another major programming language that you have to pay to use. Even Microsoft's .NET languages have an open framework, and if you're really itching to do some Mac-based VB.NET, you can use the (open-source, no less) Mono project headed by Miguel de Icaza.

As a student, the cheapest I could get Flash for was around $250. If I weren't a student, that price point goes up to $700. I have no problem with Adobe (formerly Macromedia) wanting money for their code; that's all well and good. However, there is NO WAY to do anything in Flash without using their IDE. Microsoft's Visual Studio is a great IDE. It comes with lots of awesome tools built in, and that's effing sweet. It also costs around the same amount as Flash. I couldn't find the exact price point, mostly because I was bombarded with offers for trial and free versions. There is a free edition of almost every Microsoft developer product. They're called the "Express" editions, and though they lack the bells and whistles of the paid versions, they still get the job done. In fact, if you're a student, you CAN get free versions of the full product. Yes, that's right, you can get Microsoft products for FREE, if you're a student.

The reason for this is that Microsoft wants to get students used to using their products and their languages. It's a little sinister when you think about it, them baiting you with freebies when you're a poor college student, but Adobe doesn't even attempt to do the same thing, and basically just flips you the middle finger.

When I bring this point up, most people say "Why don't you just pirate it? Problem solved!" No, you vacuum-craniumed dolt, I'm a programmer. I want to get paid for the programs I write. If someone wants to charge for their program, that's fine by me. I love open source, but hey, someone's got to make a living somehow. If I don't want to pay for their development environment, then that should be their problem, not mine. My problem is that I cannot do Flash development without buying their product. Stealing it would be similar to a wheat farmer stealing potatoes. It's just wrong. Hell, I'm sure that half the reason Flash is so expensive is to offset the lost profits from people pirating it.

The other big pro-Flash argument I get is this: artists use it. Well, you know what, screw the artists. Last time I checked, a game artist should be doing artsy things, like drawing or modeling or whatnot. My job is programming, not theirs. If some beret-topped punter is going to tell me that he likes using Flash because he can program in it, then let me get my paintbrush out, cram it up my bum, and start "composing assets." Images are all the same. A JPEG in Flash is a JPEG to PyGame, or OpenGL, or DirectX. You want vector graphics? Awesome. Fireworks does SVG. I checked. If artists want to start doing programming, then I might as well go fly helicopters, because apparently I'm not needed anymore.

As vitriolic as I'm being, there would be one simple thing that Adobe could do that would make them a-ok in my book. Don't sell the platform, sell the IDE. If I can program Flash using a text editor, I'd go for it. Eventually, once I'd made some cash, I'd see the benefits to using the IDE, and purchase it. Having an up-front cost is nothing but an unnecessary barrier that is a relic of a bygone era.

Also, I'm hoping my artist friends take my slurs against them as being all in good fun. ;-P

Thursday, April 17, 2008

Gettin Things Done

I went to a meeting for my university's local game dev group yesterday. The discussed topic was "What To Do To Break Into The Game Industry." There were some helpful hints as to languages, networking events, and the like, but one point that the speakers drove home (and one I felt was VERY important) was this:

Make Games

It's so simple, but a lot of people, and not just those in the game development field, have a problem with it. Not just games, but all sorts of programs. In school, we usually just make some throwaway scripts that we use once for an assignment and never look at again. Knowing how to do stuff is very important. Actually doing stuff, that's a whole 'nother level. If you go into an interview and say "I know how to do X, Y, and Z," they might be relatively impressed. However, if you can say "I did A, using X and Y, but I didn't get a chance to implement Z yet," that is umpteen times better than the first one. Companies don't care if you know how to do stuff. I know the basics of how to play baseball, but that doesn't mean the Phillies manager is going to be beating my door down trying to get me to join the team.

The problem with this is: making stuff can be tedious and boring. For every cool bit, there is probably ten times as much boring stuff that you have to slog through before you get there. It's hard to strike a balance, especially if you're like me and hate doing work in your free time. It's boring, but everything's boring if you do it enough. Try to add new stuff to the boring bits wherever possible, and you'll be okay.

Most importantly, though, do stuff, don't just learn stuff.

Wednesday, April 9, 2008

Randall Munroe Is A Wise Man

"When designing an interface, imagine that your program is all that stands between the user and hot, sweaty, tangled-bedsheets-fingertips-digging-into-the-back sex."

Yes, it's a little more than we'd actually want to think about our users, but replace the more graphic bit with "leaving work on time," "playing with their children," or "having the information they need before the big meeting," and it does make a person realize an important fact. Unless you're making a game (and sometimes even if you are), your user couldn't care less about your program. They want to interact with their data, or someone else's data, or data in the "cloud." Your program? They could take it or leave it, depending on the feature set, speed, etc.

Your program will often actually be getting in the user's way.

Think about how much crap Microsoft took when they had Clippit, the annoying talking paperclip. Some people still make jokes about that goof, but when you think about it, it's a feature that kind of makes sense: an intelligent help section that assisted you with whatever task you were attempting to complete. The trouble is that users don't like it when you meddle with them. They might be doing something horribly wrong (or possibly just differently from how you'd do it), but they want to be left alone. If they want help, they'll find it.

That being said, I'm not trying to make a case against unobtrusive guides. If there is a somewhat confusing section of your program, it might make sense to include simple instructions on how to use it, or what type of input format is best, etc.

Remember, your users have better things to do. ;-P

Tuesday, March 25, 2008

Open-ness: The Business Practice of...Tomorrow

I could explain why I haven't posted in a while, but I'm not going to. Anyways...

I still receive the dead-tree version of Wired magazine, even though with how long it takes to get to me, all the content is online, and a few of their stories really got me thinking. In one camp, there are the articles about how openness and freedom are the "new wave" and how businesses are doing well by "opening up." See this article for what I mean. However, this month's cover story (not to mention a slew of similar articles throughout the internet and traditional media) is on Apple's successes in the last couple of years, despite the fact that they're bucking the trend by being as "closed", or as I prefer, "evil", as a tech company can be. Sure, they're releasing an iPhone SDK, but with enough stipulations that you basically have to have your hand held by Apple the whole time if you ever want to do anything.

While I was thinking about that (my rants on Apple are usually more bile-filled, and much longer; consider yourself lucky,) I began thinking about another tech company whose mention doesn't fill me with seas of bloodcurdling rage: Nintendo. Not only are they completely locked down in terms of hardware and software, but their adherents and fanatics closely mirror Apple's when it comes to the amount of devotion said groups express. Everyone gives Sony a bad rap for being so proprietary (which is justly deserved), but when you think about it, Nintendo is even worse. Their newest-gen console doesn't even play DVDs, for Christ's sake. Imagine how (much more) awesome the DS would be if it took SD cards, played MP3s/Oggs, and had an active, engaging (and supported) dev community. Yeah, there are "hacker" groups that do some cool stuff with it, but when you come down to brass tacks, if you want to make anything for the Wii, or the DS, or the PS3, or anything uncrippled for the 360, be ready to pay through the nose.

My point is thus: Open technologies are awesome, I love them, I can't imagine making anything nowadays without assuming it would be open somehow. HOWEVER, you can still make big bucks by staying closed, and until someone (I'm looking at you Google/Android,) can prove that being "closed" is going to lose to being "open," there isn't going to be a major shakeup anytime soon.

Wednesday, February 27, 2008

The Desktop Toolbar: Almost Better Than Free Cheesesteak Mondays

If you're using Windows (and according to Google Analytics, you most likely are,) and you're like me, your desktop has a lot of stuff on it. Not clutter, no, these are important things that you use often, like web browsers, ftp/ssh clients, and files/folders holding important information. Well, that, plus all the other junk you just saved to the desktop because it was the default path. Either way, that stuff's really useful, right up until the point when you hit "Maximize." Then it's all hidden, and the only way to get to it is to minimize your (sometimes many) windows to get back to your desktop files/programs.

Unless, of course, you were looking at the toolbars included with Windows XP/Vista. If you have tried out the interesting-looking ones (like I did,) you'd find this.


Not interesting? Well, clicking the innocuous double-arrows will set YOU straight.

Yeah, that's right. All my Desktop files and folders and everything. Right there. Freaking sweet. No more Windows-D, arrow to folder, Enter, then Alt-Tabbing back to my workspace. None of that. Just click, click, done. I love it. The only thing that could possibly make it any better is if the word "Desktop" wasn't there. Maybe just a logo, to save space. Either way, I love the Desktop Toolbar; use it.

Monday, February 25, 2008

Playlist Loading/Shuffling

Sorry it's been so long since my last post. I took a trip up to New York for an OpenSocial Hackathon (which I'll discuss in a later post,) but I just had this thought mulling around my head and felt like writing it down.

I probably use my media player (WMP, Songbird, Amarok, etc.) more than any other program on my computer. No matter what I'm doing (coding, web-surfing, doc writing, or all three,) I'm usually listening to music. I'll pick a playlist, either from my list of pre-made playlists, or just throw one together at the drop of a hat. However, I really hate how most modern shuffles absolutely fail at "shuffling" my songs. On my n800, GP2X, and iPod before that, I could tell that there was no list order, but that the player literally just selected a random song from the list and played it. Yeah, it's fast, but it's lazy, and kind of infuriating. A few days ago, WMP actually played the same song twice in a row. As far as randomness goes, that's technically random, but I don't like random, I want shuffled.

When you're playing a card game, you don't reshuffle the deck every time someone puts a card down (though that could really screw up your magician friend's trick.) No, you wait until the deck has been fully dealt out (or the hand ends.)

Along with that, even though WMP loads a playlist and "shuffles" it, it's pretty lazy as well. That's why, when the playlist ends and loops, I could theoretically listen to the same song twice in a row. On top of that, sometimes it'll get into blocks where I hear all of an artist's songs right in a row, which is frustrating as well.

I don't have any code at the moment, but it would really be nice if there was a good shuffling algorithm that made sure that artists' songs were kept spread out, prevented playing the same song close to itself, and, in general, made my most commonly performed task a little bit nicer.
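If I were to sketch one, it might look something like this. This is just me thinking out loud in Python, not any player's actual code, and the (artist, title) pair format is made up for illustration: shuffle within each artist, then greedily pull from whichever artist has the most tracks left while avoiding back-to-back repeats.

```python
import random
from collections import defaultdict

def spread_shuffle(songs):
    """Shuffle a list of (artist, title) pairs so tracks by the same
    artist get spread out instead of clumping, and no song can repeat
    until the entire playlist has been played through."""
    by_artist = defaultdict(list)
    for song in songs:
        by_artist[song[0]].append(song)
    for tracks in by_artist.values():
        random.shuffle(tracks)  # randomize order within each artist

    playlist, prev = [], None
    while by_artist:
        # Greedily pick from whichever artist has the most tracks left,
        # skipping the one we just played to avoid back-to-back repeats.
        order = sorted(by_artist, key=lambda a: len(by_artist[a]), reverse=True)
        artist = next((a for a in order if a != prev), order[0])
        playlist.append(by_artist[artist].pop())
        if not by_artist[artist]:
            del by_artist[artist]
        prev = artist
    return playlist
```

Since the whole list is shuffled up front, you play straight through it and never hit the same song twice until it loops, and the greedy pick keeps any one artist from hogging a block of the queue.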

Thursday, January 24, 2008

The MacBook Air...Yeah...

Unless you've had your head buried in the sand for the last two weeks, you've heard about Apple's new MacBook Air, the super-slim notebook that apparently looks really cool. To slim it down, though, some features were left out. Such as...

  • Normal port set, including Ethernet
  • An optical drive (unless you want to use RemoteDisc, which is still pretty useless)
  • A changeable battery
Now, I'm going to say that I'm not a huge Apple fan. I think that Steve Jobs is an all-around jerk, and their products are vastly overpriced and not anywhere near well-designed enough to warrant that cost. However, I am thinking about getting a Mac for my next laptop, so I can triple-boot and code for all three major platforms. Unfortunately, the Mac I'd want is currently out of my price range at $2500, so it might be a while before I act on that. In fact, my current laptop, a refurbished Averatec 2260, which cost me 8 hundo, is running great. Plus, it has an Ethernet jack, in case I go to my dad's house. Oh, and a DVD burner, so I can watch movies I buy, or make backups of my data. And I forgot the SD-card reader, which is really helpful now that I have a Nokia n800 (which I LOVE, by the way.)

One thing I do find HILARIOUS, though, is that the last time I used a laptop that required a USB-connected drive, it was 1992-93, on my mom's old IBM, whose model number I can't even remember, but boy was it a clunker. It is counter-intuitive that to use the world's "most portable" laptop, you've got to schlep around another attachment. That might just be me, but I'm pretty sure that the industry started putting drives in laptops so that people didn't have to carry around all that extra crap.

To be fair, there are ways around some of these issues. I haven't seen anyone try it, but there are USB Ethernet adapters, which could solve the wired internet problem. As far as the Remote Disc issue goes, that's Apple's problem. My guess is that the next version will fix the music/DVD reading issues. Didn't anyone learn anything from the early iPhone adopters? Apple has a habit of screwing over its most loyal fans; good thing they like it.

However, I had an "aha" moment when thinking about the battery issue. Sure, you can't change the battery yourself, and if you run out, you're scoobied. Unless Apple has the same idea I had. The Mac power adapters are already pretty huge, though I'm guessing the Air's is smaller (I haven't seen it yet.) So, what should the Jobspire do? Slap an external battery into the power adapter! Charge the laptop's battery first, then the external battery, and if the laptop's battery is going to die, and the user doesn't have access to an AC outlet, they can still plug their power adapter in, and run off the battery power there.

In general, like all of Apple's products, the Air is ridiculously over-hyped. I'd feel bad about giving it more coverage, but I don't get enough traffic to feel guilty about that. It's a one-trick pony, and that trick gets pretty boring real quick when you can't play a DVD.

Wednesday, January 23, 2008

I Hate Scrollbars

I have a big post I'm working on about the Microsoft Sync Framework, but I'm working on some UI mockups right now, and realized that I have something to rant about.

In general, I HATE scrollbars. I hate including them in my UI designs because, in my mind, they represent a failure to display all the necessary information in a neat, compact manner. Don't get me wrong, in many instances they're perfectly acceptable, but with what I'm working on (rich internet applications,) if your user has to scroll, and it's not because of their content (and sometimes even if it is,) you've probably messed up somewhere. The extra effort a user has to exert by scrolling had better be worth it, and in many cases, it isn't.

Prime example: Blogger. My browser is fullscreened, on a monitor with a resolution of 1280x1024. There's tons of whitespace to the right of and below the edit window. However, by the time I finish this line (* right there, actually,) I have not only a vertical scrollbar, but a horizontal one. Google has all sorts of Javascripty-Ajaxified coolness on this page. Why can't they resize the editing iframe? Or wrap properly. 

Part of it is the inability of developers to come up with interesting ways to display information. That inability comes partially from the fear that users will reject what is new and strange. That's bull. When I first started using Office 2007, I had a hard time adjusting to the "Ribbon" UI for about the first week. Nowadays, I can do what I need to a lot faster, and Office 2003 does nothing but remind me that sometimes Microsoft products DO get better with newer versions. Good UI will keep its users, even if it's initially a little bit different.

That's the end of my first real rant, I'm sure there'll be more to come.

Friday, January 18, 2008

Open Gaming

One of the things that's always bothered me about video games is that they're usually very restrictive. You have to have X platform, Y controller, give money to Z publisher. Personally, I look at the Xbox 360 and PC (in particular,) and say "why can't they play games together?" I bought Team Fortress 2 for my PC, and can't play with people on Xbox Live. I know a solid bit of that is because Microsoft (and Sony, and Nintendo) want to make more money, but as the consumer, I feel shafted.

I had an idea the other day, along lines similar to this: make a game that everyone can play. Publish the game state data as an XML feed. Allow platform developers to make their own interfaces to the game, and have them send responses back to a server, which can be either dedicated or one of the participants in the game. You'd have to worry about people cheating, but in my mind most of the actual processing would be on the server side. That way, even thin clients or browser-based interfaces would be able to play in the same game as a $4,000 PC running a DirectX10-based interface with HDR and all the bells and whistles.
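Just to make the idea concrete, here's a toy sketch in Python. Every element and field name here is invented for illustration, not any real protocol: the server holds the authoritative state, validates each move, and publishes the result as an XML feed any client can render.

```python
import xml.etree.ElementTree as ET

def state_to_xml(state):
    """Serialize a (made-up) game state dict into an XML feed that any
    client, from a text-mode thin client to a DirectX10 monster, could render."""
    root = ET.Element("gamestate", turn=str(state["turn"]))
    for p in state["players"]:
        ET.SubElement(root, "player", name=p["name"],
                      x=str(p["x"]), y=str(p["y"]), hp=str(p["hp"]))
    return ET.tostring(root, encoding="unicode")

def apply_move(state, player_name, dx, dy):
    """Server-side move handling: the server validates every response,
    so a cheating client can't just claim it teleported across the map."""
    if abs(dx) > 1 or abs(dy) > 1:
        return state  # reject illegal moves outright
    for p in state["players"]:
        if p["name"] == player_name:
            p["x"] += dx
            p["y"] += dy
    state["turn"] += 1
    return state
```

The interface on each platform only has to parse the feed and draw it however it likes; the rules all live on the server.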

I don't know, this may just be a pipe dream, but I think that at the very least it's an interesting concept.

Tuesday, January 8, 2008

What I Learned About User Experience Over Winter Break

I know I've been bad. The holidays, projects I shouldn't be blogging about, and my shiny new Xbox 360 have been the major contributors to falling off my "one post a week" horse. Either way, I should be back on said horse, and riding back into the Wild Blogtier. Let's start rustling.

Since I'm still (technically) a student, many of my fond memories are from my younger years, where homework that took half an hour was epically long, girls were icky, and corn dog night was just about equivalent to Christmas or losing a tooth. One of the other things I recalled was the ever-popular "What I Did On My ___ Vacation" essays. In my quest to find shiny things for my girlfriend's Christmas present, I learned a valuable lesson...about user experience.

I live in Philadelphia, and there's a section of Eighth Street called "Jeweler's Row." That section (if you haven't guessed already) has lots and lots of jewelers on it. It's the logical destination for any hapless (see: male) potential jewelry buyer, since you can find pretty much anything you want there.

I looked into a couple of the smaller ones, since I try to support local businesses over large chains, but they all looked pretty shady. If I'm spending a significant amount of my cash on something that does nothing but look pretty, I tend to want to know I'm not being taken for a ride. So I went to a large chain store.

The salesman was attentive and friendly, and even offered me a cup of coffee when I came in. I told him what I wanted and what my budget was, and he told me that what I wanted (garnets) was not in high supply (which was odd, considering they're January's birthstone), but that he could get some for me and set them himself. That seemed fine to me, so I agreed to call a few days later to check on the progress.

I call, he's sick, call back on Monday.

I call again, still sick, call back on Tuesday.

I call again, he's still sick, call on Wednesday. It's the 18th, and I'm leaving on Friday to visit family. I ask if I can get anyone to help me. They say no, because it was his sale. I tell them to tell him not to bother, because I'm going somewhere else.

I try another large chain, but apparently they only had diamonds. So much for that.

Finally, at my wits' end, I try one of the hole-in-the-walls that was suggested to me by the salesperson at the diamonds-only place. Lo and behold, I walk in there, tell them what I want and what I want to pay, and they say "okay, we'll have it for you on Thursday."

The lessons that I learned here go something like this:

1) Jerk the consumer around (even non-purposefully,) and they'll leave you.
2) If a customer's only contact is unavailable, and you won't provide another, you'll lose them.
3) If you can't provide what your customer wants, point them in a direction they can use. Then, they might come back to you when they need what you do provide.
4) Appearances are nice and all, but in the end, what matters is if you can deliver the goods or not.

And that is why I think the T-Rex was the coolest dinosaur ever.