Thursday, July 17, 2008

Why you shouldn't be a developer

So, I just got through telling you to learn how to write a bit of code. Maybe you tried it out and thought, hey, this is easy! Developing is easy money!

Stop. Just... Stop.

Coding, like cooking, is easy to do when you're not doing much. Anyone can boil an egg, or slap together a cake from a mix. A developer worth his or her salt isn't doing that.

Coding is more than knowing python or C or Java. It's a way of thinking. You have to be able to break things down into their component parts. It's being able to put parts back together again in ways that still make sense.

The best developers I know may prefer one language, but if you held a gun to their head, they could pull a few more out of their hat. If you handed them a language they'd never seen before, most could get up to speed fairly quickly if they needed to. We have our favorites, but the dark truth about good developers is that if you needed them to program in their most hated language... they probably could. Hell, they see career development as being able to take languages OFF their roster and still land good jobs.

Back in the dot com heyday, I was in the 'make it or break it' course for computer science: Data Structures. It was required of every student, and it was a prerequisite for every class after it. If you didn't pass it, well, there was always Communications... I made the acquaintance of a guy who was struggling. Badly. He didn't understand object oriented design, or design patterns, or reuse. He could write C++, but he couldn't think in it. All semester I babied him in exchange for free food and sodas, and somehow, by some miracle, he passed. He memorized all the procedures and right answers and, Rain Man style, spit them back out at test time.

The prof saw how much he struggled, and on the last day, pulled him aside.

"Why do you want to do programming?"

"Um... money?"

"Fair enough." He lowered his glasses and gave him that look they teach you in professoring school: the kind that bores through your skull.

"I hope you like Ramen."

He switched to communications.

Tuesday, July 15, 2008

Why you should learn to write code

Yes, you.

There hasn't been a job yet in my repertoire that wasn't improved by a bit of code. Oddly enough (or sensibly enough, as you'll see) the parts that improved weren't the more cognitively demanding areas of those jobs. It was the dull parts where ten lines of script helped me out beyond belief.

Real life example: we have to upload our code for every project into a repository that resides offsite. Due to a few business rules, we can't just ftp files over, or give them SVN checkout permission to our servers. Not a huge issue, really, since code bases are usually pretty small.

Cue the day I handed over a product that, due to images, video, and flash, was over a gig.

  1. It belonged to the project with monthly releases.
  2. And each release had to have its own directory on the server.
  3. Did I mention that there was a file size limit for uploads?

I really thought the code manager was going to cry.

We came to the compromise that I would do deltas for him. Each month, I'd bring over only the files that had changed, with file structure preserved. That way, if they did have to pull the code out and redeploy it, he could just cut and paste the latest delta over the first release and hand it over. Doing this by hand? A bitch. Doing it with python and subversion?

A snap.

I had SVN create a file called diff.txt that gave me a list of every file that had changed between this release and our first. Then I wrote a python script that either copied a file into a new folder if it had been added or modified, or made a note in a file called deleted.txt if it had been deleted.
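
For the curious: svn will build that list for you. I won't swear this is exactly how I did it, but --summarize is the magic flag, and something like this (with made-up repository URLs, obviously) lands you a diff.txt full of one-letter statuses and file paths:

import subprocess

# Ask svn for a one-line-per-file summary of everything that changed
# between the first release and the current one. The URLs are invented.
out = open("diff.txt", "w")
subprocess.call(["svn", "diff", "--summarize",
                 "http://svn.example.com/project/tags/release-1",
                 "http://svn.example.com/project/trunk"],
                stdout=out)
out.close()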

Writing it took me a few hours. Running it takes minutes. Every release, it saves me god knows how many hours of cutting and pasting.

You don't need to make megapiles of script for code to be useful. My useful script, including a probably overdone open-and-read-file function, was under 50 lines.

I started off with the advantage of having learned how to program in high school and a bit in college, but I do believe anyone can learn to code. Python is a great place to start. It's light, fast, free, and has a strong community. A few places to start:


I'm a lady of my word, so here's the code:
import os
import shutil

def openFile(filename):
    """Given a filename, return a list of its non-blank lines."""
    f = open(filename)
    a = []
    while True:
        line = f.readline()
        if line != '':
            if line != '\n':
                a.append(line)
        else:
            break
    f.close()
    return a

def moveFiles(array, base_path):
    '''For each changed file:
    1. Make sure that the directory structure is there.
    2. Copy the file over.
    Deleted files get noted in deleted.txt instead of copied.
    '''
    deleted_files = []
    for line in array:
        files = line.rsplit()
        if files[0] != "D":
            # Added or modified: copy it into the deltas folder,
            # building the directory structure first.
            orgfile = base_path + files[1]
            files[1] = "deltas/" + files[1]
            newfile = files[1].rsplit("/")
            newfile.pop()
            dir = ""
            for folder in newfile:
                dir = dir + folder + "/"
            try:
                os.makedirs(dir)
            except OSError:
                pass  # the folder is already there
            shutil.copyfile(orgfile, files[1])
        else:
            # Deleted: just make a note of it.
            deleted_files.append(files[1] + "\n")
    if deleted_files:
        out = open("deleted.txt", "w")
        out.writelines(deleted_files)
        out.close()
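
And if you're wondering how the two halves meet, a couple of lines like these at the bottom of the script will do it (the paths are invented; point them at wherever your diff.txt and first release actually live):

# read the svn summary and copy everything that changed into deltas/
changed = openFile("diff.txt")
moveFiles(changed, "/path/to/release-1/")  # trailing slash matters for the joins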

Sunday, June 22, 2008

UIs - Full Circles

In the beginning, there was the command line, and we were grateful for it, darn it. A single blinking underscore in orange or amber or green (depending on what strange theory about eyestrain was in vogue at the time) was all we needed to get our computers to do what we needed them to do.

My first foray into computers was during this time. I cut my teeth on the Commodore 64, a beige box filled with strange churning sounds that allowed me to program and play games made simply of words and my own mad stabs at the game's internal dictionary. A few of my games had images, but nothing that could be called an interface, per se. They were usually badly pixelated depictions of trolls that were trying to crack open my skull before I saved the maiden fair.

Then came the GUI. Graphical user interfaces (GUIs) first appeared for the general public with the Apple Lisa and Macintosh. Instead of having to remember commands like CD (change directory), LS (list all files), or PS -A (list all processes), you found what you needed in the interface. Instead of CD MYDOCU~1, you clicked on "My Documents". Instead of LS -v, you simply looked at the contents of the folder, displayed in a handy box. PS -A? They had a handy tool for that. The command line as a common tool was dead.

Where GUIs removed obscure commands from the user experience, they ushered in a way of working that was easier for the average person to understand. People outside the usual circle of technophiles started to use computers. Businesses brought them in, since it no longer took a separate degree to be trained on them. The machine, created by us, had started to change us: how we did work, how we thought about computers.

GUIs were not without problems, though. Problems usually arise when engineers have created the interface. The meltdown at Three Mile Island was the result of an engineer doing this. Two dials that had to be monitored simultaneously were placed on opposite ends of the room. If there were two technicians on hand, they had to yell out the readings to each other. If only one was on duty, he or she had to run back and forth from one dial to the other. To an engineer, where the dials were placed made sense. It simply didn't make sense once you added the human element.

While a faulty interface in a computer program has never led to a nuclear meltdown, it can lead to endless frustration for even the most technical of users. In the early days of graphical programs, standards hadn't been settled upon by the various competitors, with good reason: each hoped to get customers so used to its style that they would never make the switch. A famous rivalry was the one between Microsoft and Corel. Both produced a set of expensive office tools used for document and spreadsheet creation. The core features were nearly identical. In other arenas, the battle for dominance might have taken place in advertising, with clever slogans and campaigns to keep users hooked. With Microsoft and Corel, however, the battles took place in the UI. They moved around common elements, such as text formatting and options, just enough to make switching daunting to the average user.

Once the initial rush to claim users was over, standards began to evolve. We based icons and organization off of things in our everyday life: folders to hold things, magnifying glasses to look for things, images of disks to indicate saving things. As time passed, users and developers began to agree on standard icons and terminology. An image of a disk was used to indicate the ability to save what you were working on, not to open a saved item.

Having an interface, though, started to change the physical appearance of the computer. Monitors, in the days of amber screens, didn't have to be high resolution, or even very large. A nine-inch monitor wasn't unheard of. Color was rarely necessary, and when it existed, it was usually only in the most basic of primary and secondary tones. An interface done only in blocky ROY G. BIV is painful to use, so higher resolution monitors started to become the standard. In order to render these higher resolutions, better processors became necessary.

Not only were existing elements upgraded, but entirely new pieces of hardware were added. The mouse became absolutely vital to navigating a visual interface. Where before you found your way around a file system semantically, by file paths, now you did it visually, by remembering what you put where. Instead of keystrokes, you had clicks.

A funny thing happened with adding the mouse. People started complaining that their wrists hurt. The first reactions, in the mid- to late-eighties, were derisive. "Mouse arm" became a running joke around many agencies. To those who had developed it, it wasn't nearly so funny. We were used to the idea that sitting in a chair for eight hours straight might hurt our backs that evening, but it had never occurred to us that using something might actually damage us. Ergonomics was beginning to enter the scene. Gel pads found their way onto desks, mice were shaped to fit the hand better, and studies were conducted to find the optimal positions to make sure we weren't hurting our backs, hands, or eyes. As for those who had already developed what came to be known as carpal tunnel syndrome, many resorted to braces around their wrists. When that didn't help, they often had to resort to surgery.

In the past ten years, a curious thing has been happening with interfaces. The humans using them are beginning to push back. First, engineers thought they knew best how to display data. Then it was human factors psychologists. What became clear, however, was that users wanted to define their own experience. Applications with a static toolbar began to lose favor to those that gave the user the most choice in where elements might be displayed, and which ones they wanted to toss entirely.

The first time I saw a custom interface was when I was introduced to the Plus! package from Microsoft. It seemed a cute way of customizing the way your desktop looked, linking background, icon sets, and color schemes. As I looked around for new themes to install, I found the usual gamut of cartoon characters and puppies, but I also found something interesting: themes based on working. One used a background where you could sort files and folders into color-coded categories. File types were color-coded blobs that were easy to find in a visual scan.

As the years passed, I noticed more products coming out that allowed users to customize their experience. Products like Konfabulator and Google Desktop not only allowed a user to change how their desktop looked, but what was displayed there. Little portlets could display the weather, headlines from a news source, or the latest entries from a blog.

Up to this point, customization seemed limited to serious applications, like word processors and spreadsheet managers. A few less-serious areas had grabbed onto customization technology, like RSS feeds and blogs, but things like games remained locked into whatever a designer had decided back in development. This all changed with a game called World of Warcraft.

World of Warcraft is a massively multiplayer online game (MMO), where people level up avatars by killing rats, then ogres, then demi-gods (with a few steps in-between, naturally). It wasn't the first to do this. Earlier games, such as Everquest, Ultima Online, and Dark Age of Camelot, worked along the same lines, and each had a reasonable player base. Warcraft came out, and sales skyrocketed. People not only bought it, but played it, and kept playing it.

My husband had to talk fast to get me to play another MMO. I'd left the last one in disgust, and swore never to play another one again. He assured me that Warcraft would be different. After installing it, he went to work on my interface. Blizzard, Warcraft's creator, had opened up a tool-set to allow users to create custom interfaces for their game. Users then turned around and posted them. I was able to install products that allowed me to see information that I wanted, how I wanted it. I was a damage-dealer, so I wanted data on how hard I was hitting. I could get that in floating numbers as I hit something, then as a print-out after a fight was over. My husband wanted a map that he could mark up however he wanted, noting everything from where he found cool things, where neat views were, or where a group of us were meeting up.

While advertising and buzz got people to the store to buy the game, it didn't make them continue to play (paying a monthly fee all the while). The other games had content. They had dragons and neat gear to wear. What they didn't have was the ability for the user to have control over what they saw, and how they experienced the game.

One intriguing result of the add-ons was how they began to influence the game itself. As more dungeons were created, more encounters were not only made easier by the add-ons, but seemed to require them. One popular modification was Decursive. When a person in your group became cursed, certain classes had the ability to remove that curse. Before Decursive, this took constant scanning. With the mod installed, a box would pop up showing the affected character. Click the box, and the right spell went off, curing him or her. After Decursive became popular, the dungeon designers at Blizzard started adding in creatures that, instead of sending out curses one at a time, would affect the entire group or raid. Curing them all would be impossible without Decursive installed. The interface was now not only changing how the user interacted with the game, but changing how that game was further developed. Not only were the humans pushing back, but the machine was responding.

It has taken time for designers and engineers to let go of the idea that they know what users need most. As our capabilities in designing interfaces grew, studies multiplied, trying to discern how to capture the attention of the most users. Were animations helpful, or harmful? What colors were best for indicating that something was a link? What part of the page is best for getting someone's attention? How can we affect how much a user comes away with? Any time a study tried to answer one of the above questions, the researchers usually came away with an option that was strong, but certainly didn't cover the entire subject pool they had studied.

Google's recently released personalized homepage works from the premise that one answer will not suit everyone. Previous portals, such as Yahoo's portal circa 1998, only allowed a set number of items to be shown, and all had to be on the same page. With Google's portal, users have the ultimate flexibility: they can choose content, placement, and even group items in ways that make sense to them. Users can even create their own custom portlets, then share them for others to use. In my office, most of my coworkers have Google's portal as their homepage, but everyone uses it differently. One groups different news sources on different pages. Another keeps track of blogs and local events. I have weather, comics, and a few news feeds to keep me current. When I was a user of Yahoo's portal, I knew of almost no other users. Now, everyone I know seems to use some variation of Google's homepage.

The cycle of humans pushing back on technology is showing up in one area in particular: it's encouraging people to become more technical in order to get what they want. While most will never pick up a programming language, more people every year seem to know what an RSS feed is. For those who do know how to program, user communities are expanding around popular products, like bulletin board software or content management systems. Ten years ago, most members of those communities were computer science graduates, or people who had been in the industry for years. Today, online guides and "Dummies" books let nearly anyone learn to code. Communities are still made of professionals, but also of those who only picked up a book when they wanted their board or CMS to do something, and couldn't find someone else who had done it already.

Indeed, in a few small ways, we're almost coming full circle. I was in one of my clients' offices a few weeks ago. He wasn't the most technical of customers. Though brilliant, he had trouble with his laptop on a daily basis. I was there to find out why syncing was taking him so long.
"Can you bring up your task monitor? Go to the Apple icon--"

He cut me off. "Oh! I found another way you can do that!" He opened up a command line terminal, then pecked out PS -A. He hit enter, and a list of his current processes popped up, complete with how much they were eating at his processor. "Isn't that clever?"

"Boy," I said, "Wait until I show you grep."

Monday, June 2, 2008

Email doesn't work

There is one thing that annoys me more than anything when on a team based project:

Email.

Not volume. Having a Blackberry and some smart filters helps cut down on volume immensely. It also doesn't hurt that most of my team is familiar with the correct way to write an email, either by design or by personality. No, my problem with email is that it's not designed for team communication.

1. It's always hidden.

Email, by default, is only available in two places: the sender's outbox and the recipient's inbox. Barring a bored sysadmin, only two people know of any email's existence: the sender and the recipient. If a concurrence went through, or estimates on hours, or the secret panic word that means that monkeys are loose on the fourth floor, the rest of the team is reliant on those two to spread the word. Spreading the word, if it happens at all, can add a layer of misinterpretation and obscure the trail of logic. At the end of the day, you're late, over budget, and monkeys are jamming the printer and making a mess of the coffee supplies.

2. It's not reliable.

We have a problem where I am with mailboxes always running out of space. People love to email documents back and forth (oh, more on that later!), leaving me with .pst's that hover around a gig. Keeping them cleaned out helps, but there are days when you're tooling around with customers, and someone has decided that the weather is perfect for a pdf storm. By the time you're back to the office, you realize that your Blackberry has been quiet the past hour, not because everyone has decided to take you off their cc's, but because your email box looks like the "Gluttony" guy from Seven.

3. It's a good excuse.

Does anyone else grit their teeth when they hear someone say that the email they sent out three weeks ago was 'caught by the spam filter'? I'd love to see a study testing how often that's actually true. Or that another filter 'went rogue' and started sorting emails into random folders. I hate that I have to follow up emails with a desk-side visit. It defeats the purpose of an electronic format.

4. It's not a collaboration system... but people think it is.

Documentation fairies are quite familiar with having to work on documents collaboratively... sometimes not on purpose. There are times a supposedly locked document goes out for review, and comes back completely altered.

"I made some edits for you! :D :D :D"

That's not so bad, if you only have one document out in the ether. If you've sent it out for ten concurrences, though...

That's a lot of smilies. :(

5. It's not a backup... but people think it's that, too.

I know people who email themselves a document as a back-up. Or they send it to someone else, assuming that person will hold on to it for them.

These people have access to network drives. And SVN. And file drop systems. All of which are backed up.

This blows my freaking mind.

So what to do?

That's actually my next project. Here are my goals.

1. Get communication out of the inbox and somewhere public.

A website is the preferred venue, since the main goal of the web is to share, and there are about 10,476 products out there to help groups of people share. Also, once an item is up, it's up. We can see it's up. We know that other people can see that it's up. No more lost, eaten, or ignored emails.

2. Utilize workflows.

I hate workflows in PowerPoints. To me, a workflow should never rely on people remembering a scrap of unUML. If you're going to go to the effort of making the flow, go ahead and feed it to a machine that will tell me what to do when my section of cogwork is done.

3. Teach the tech-scared versioning tools.

I have taught graphics people to use SVN. There is no reason on ghu's green earth that I cannot teach anyone, up to and including my seven year old, Subversion. Plus, I really like the name. I'm hoping they come out with a sister product one day, called Perversion.

I'm hoping we can return email to what suits it best, at least in my small section of the industry: a great little communication tool, rather than a project manager and archive system.

Wednesday, May 21, 2008

Whoa - Embedded Google docs?

Am I the last person who found out you can embed your Google doc presentations into your website? I thought I knew all the cool tricks!



This was all I had on hand that's worth putting up... and only makes sense if you know all of those psychologists and have read the book Reaper Man by Terry Pratchett.

Sunday, May 18, 2008

It's not hard. Really!

Along with all the analyzing of business I do, I also bake. Get a room with more than five people in it, and I'll get the itch to put something together. It feels very odd for me to come to a shindig without a covered dish of some kind.

People don't bake as much as they once did. This leads to some surprising statements when I bring something in, like my cinnabons.

"Wow, that must be really hard."
"Actually, no, it's pretty easy. I can give you the recipe if you--"
"Oh, no. I could never do anything like that."

You know where else I hear this?

When I tell people that I can read code.

A surprising number of PMs/Designers/BAs all proclaim they could never learn to read code. I say they weren't in my CompSci classes eight years ago.

True, back then, I wouldn't have been asking that more people learn to code. I tutored a guy who had me tearing out my hair. Still, he got the basic concepts. Sure, his code threatened to crash our ironclad Unix server, but he could tell a for loop from a declaration statement.

It doesn't take much to learn. A community school level 101 class would bring most people up to speed on basic structures, good practices, data types, and common syntax. Even someone with enough chutzpah and a good Dummies guide could get brave enough to look at some code in the wild.

I think the fear of looking at code comes from the same place as the fear of baking. No one expects that your first cake is going to look like something from Ace of Cakes. I don't even really expect it to come out of the pan properly. First cakes are meant to be iced and eaten straight out of the pan, leaving you, the kitchen, and your sense of what a cake is in total disarray.

The first time you read code makes you feel a bit odd. It's the dissonance between knowing that you're looking at gibberish, and yet it's doing something. Something strange and mysterious and not meant for those who see daylight, and yet it's starting to make sense.

No one really expects you to start slinging your own code at this point. No one wants you to be Gates or Jobs or whatever guru people are wanting DNA samples of these days. Just some basic comprehension.

I promise you two things:

1. The sudden desire to make a proper cake will not turn you into Martha, demanding a clutch so that you can raise your own chickens and therefore have fresh eggs for your cakes. Because what the hell.

2. The sudden insight into the craft of the coder will not give you a stronger affinity for Cheetos. Because some of them really prefer beef jerky instead.

While we're at it, let me put up my recipe for cinnabons. They take some time, but most of that is spent sitting on your rear, reading blogs and drinking coffee.

EAC's cinnabons

Ingredients:
  • 1 (.25 ounce) package active dry yeast (seriously, those little packs you can buy at the store. People use those!)
  • 3/4 cup warm water (110 degrees F/45 degrees C) (Tap water that's almost too hot to hold your fingers under it should do.)
  • 1/4 cup white sugar
  • 3/4 teaspoon salt
  • 1 egg, room temperature (Seriously. Take the freaking egg out when you start)
  • 2 1/2 cups bread flour (Not white. Bread. King Arthur is best, but good ol' Pillsbury will do).
  • 1/4 cup butter, softened (if you forget to take it out, put it in the microwave for a minute on 30% power)
  • 1 tablespoon ground cinnamon
  • 1/2 cup brown sugar (I use dark, because it stores better.)
Directions:

  1. In a small bowl, dissolve yeast in warm water. Take out your egg and let it get to room temperature. Seriously. Then let the yeast mixture stand until creamy, about 10 minutes.
  2. In a large bowl, combine the yeast mixture with the sugar, salt, egg and 1 cup flour (that's right. Not all the flour); stir well to combine. It shouldn't look horribly lumpy.
  3. At this point, if you're using a stand mixer, and you have a hook attachment, switch to that. No foul if you don't have one, but they're easier to clean.
  4. Stir in the remaining flour, 1/2 cup at a time, beating well with each addition. When the dough has pulled together (looks like a lump), turn it out onto a lightly floured surface and knead until smooth and elastic, about 8 minutes. Truthfully, I don't usually knead it that long. But it's good exercise, so what the hey. Don't be shy about adding more flour, if the dough feels sticky and is being fussy. Dough can be like that.
  5. Cover with a damp cloth and let rest for 10 minutes.
  6. Lightly grease an 8x8 inch square baking pan. Roll dough out on a lightly floured surface to 1/4 inch thick rectanglish shape. Smear the dough with butter and sprinkle with cinnamon and brown sugar. Roll up the dough along the long edge until it forms a roll. Slice the roll into 16 equal size pieces and place them in the pan with the cut side up.
    1. Easiest way to get 16 pieces? Cut the log in half. Then the half in half. Then one of the quarters in half. Then one of the eighths in half. Recurse your way up (there's a little python sketch of this at the end of the post). Don't try to cut sixteen starting at one end and going to the other end. Not even cyborgs do it that way.
  7. Cover pan with plastic wrap and refrigerate overnight or cover and let rise at room temperature until doubled in volume, about 30-45 minutes. I am impatient. I do it at room temp for 30 and then cook it.
  8. Preheat oven to 350 degrees F. Bake rolls until golden brown, about 17-20 minutes. Watch them! We're going for golden brown, not an unfortunate mass of bubbly blackness.
  9. Take them out and ice them. You can use canned icing, nuked for 30 seconds. I never have that on hand, so I use 1/2 c powdered sugar, 1 T vanilla extract (or vanilla rum/vodka), and 2 T milk. It's not an exact science.
If you can make those, I swear, you can grok python.
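
And because I can't resist: that cutting trick in step 6.1 really is recursion. Here's the same idea as a toy bit of python, just to prove the point (no actual dough object included, sadly):

def cut(log, pieces):
    """Keep halving until you have the number of pieces you want."""
    if pieces == 1:
        return [log]
    half = len(log) // 2
    return cut(log[:half], pieces // 2) + cut(log[half:], pieces // 2)

# sixteen slices out of one "log" of dough
print(cut("================", 16))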

Sunday, May 11, 2008

Bit by bit, inch by inch

As I've mentioned before, I'm a business analyst with a bit of a technical background. This is to my advantage in one area in particular:

Prototyping.

I honestly don't know why industries don't do this more. Car companies do it, as do clothing and food companies. Do up a rough, hand it over, and see what comes up.

With some segments of the software industry, though, it seems the norm to only let the customer see and touch the product in five to ten minute increments, or through a series of nicely printed screen shots. The fear is that the customer will (dun dun DUN!) change his or her mind.

You know what? They do that anyway.

The other worry is that the customer doesn't really know what they want. While this is true, it's not unusual to get a customer who will have that feature, come hell or high water.

It reminds me of my son and video games. Like clockwork, Hollywood churns out a new kids movie, then a new movie themed video game. Without fail, about 90% of the spin offs are horrible, and without fail, JT will ask for every one of them.

What we have now is a subscription to a game rental service called GameFly. When he starts pining, we put the game on our queue. Once it's in, we can watch to see how often he picks up the game. Some are duds, offering no more than a few minutes of entertainment (Cars for the DS was in that category). Others are too hard, requiring more than the occasional bit of help from a parental unit. Others are just... odd. Happy Feet has us scratching our heads.

Yet he wanted, nay, needed them all.

At least with GameFly, we can test things out. If it's a dud, back it goes, with us none the poorer. I've started doing something similar with customers.

Instead of a long sit down where I force them to pull requirements out of the air, I give them a base product. Usually, it's an open source CMS of some sort, for which we have a shop of developers. Load up a few basic add-ons, then let them play. While they kick the wheels on their time, I can watch what they do from my office. Every so often, I check back in, and we refine what we have by slashing things that they never used, and making what they did use more effective.

True, I'm a bit more technical than some, and a bit less fearless. Honestly, though, what I'm doing doesn't require a lot of code slinging. Plone has a feature where you can import UML to create a product. It looks like the Django community is working on one too. If you can make Visio work, I argue that you can be taught UML.

There are downsides to this process. It takes longer. You're dependent on the client actually kicking the wheels when you're not at their desk. Devs come in at a later stage, which some object to. It relies on open source or out-of-the-box products. It assumes your customers are the kinds who can change their minds.

Still, in my three betas, it's going well. It may not have the rush of a grand vision, but at least I'm reasonably sure at the end that it'll get used.

Monday, May 5, 2008

PIE. PIEPIEPIE.

Project minions often have a misconception about requirements gathering: that it's a one time deal. An analyst goes to the client, has a sit down with donuts and coffee, and leaves a few hours later with a laundry list of items for people to code/skin/write charge lines for.

If only.

It's more like going to the grocery store with a hyperactive significant other.

"Okay. We're here to shop for Thanksgiving. What do we need?"

"Is this going to take very long? Because the game's coming on, and..."

"I'll make it as quick as possible. First on our list: the main dish. I was thinking--"

"Can't we start with dessert? That's more fun."

"It's at the other end of the store. Anyway, we should pick out the turkey and ham first, right?"

"Sigh. Fine. That one and that one."

"Are you sure, because I don't think the turkey will be big enough to feed--"

"What are you talking about!? It's HUGE! So, desserts."

"Sigh. No, now we do sides. Here's a list of recommended sides."

"Wow... that's a lot of sides. Why do we have to have so many?"

"Because people like them, and hundreds of Thanksgivings have shown us that this is a good standard load."

"Feh. Halve it. We can spend what we save on pie."

"Fine. Whatever. You get to explain that decision to grandma. Now for drinks..."

"PIE!"

"Okay! Okay! Pie! Go get your pies and then we can talk drinks."

Scamperscamperscamper

Waitwaitwait

"Back!"

"That is WAY too much pie. That's nearly our whole budget!"

"But... but... Carl has this pie at his dinner, and Jennie has this pie at her dinner, and Lisel has CAKE, and this one..."

"Just because someone else has it doesn't mean you have to have it!"

Poutpoutpout

"Sigh. Fine. We'll have turkey and pie. But you get to tell everyone why."

"Everyone will love it. You just wait and see."

There are very few people who love turkey and pie, exclusively.

This is what gathering requirements is like to me, when I try to do it in one sit down. There are, of course, alternatives. Those, next time.

Note: Before the world thinks that I'm portraying my SO, I would just like everyone to know that my darling husband grocery shops like someone has leaked poisonous gas into the store. He is, however, prone to sneaking Pop-ums into the cart when I'm not looking.

Saturday, May 3, 2008

Beckett and Development

You know, I don't really know why more developers and otherwise techy types don't love postmodernism.

I love postmodernism. It's the one period that, on its own, can get me tossing the kids at my mother and putting my butt on a train into DC. I hated that I never got to study it in grade school, as any time after WWI or II, the budget and time generally ran out.

With postmodernism, you don't need to worry about what went before. The author doesn't write about the way the dew falls off of the leaves, or the way the heroine's red hair dances in the wind, all the while meaning to talk about the way these things are affecting the subject. A postmodern artist will just say it, if it's that damn important. What's important is cut down, too. Who needs chapters of material that set up the world? A story is a story, no matter where it is. If it needs to be on a farm or in a shop, then assume the reader knows what a shop or farm is and move on.

Also, with postmodernism, you get a chance to take it as is. You don't get this with Shakespeare. Shakespeare's works are great, don't get me wrong, but there's four hundred years' worth of people talking about Shakespeare out there. There's not much chance you're going to come up with a take that someone, somewhere, hasn't already said. It reminds me of my high school English teacher, who would shoot down some of our more creative interpretations with a "If that were valid, someone much more worldly and apt to turn in assignments on time would have said it already."

I mean, we're already on recursing Shakespeare, writing plays about his plays.

You also get to forget all the old symbolism. In older literature, we are inundated with references to the Bible, mythology, famous (always famous) works of literature... and it almost always means the same thing. After a while, you're wondering if the whole Christ/Icarus/etc figure thing hasn't jumped the shark a bit.

When I mention to my tech friends that I like postmodernism, I get 'looks.' Blank looks. Disgusted looks. Looks like I admitted a strange fetish casually over morning coffee. I don't get the fear of it. After all, I don't need to know when a po-mo author was born, or where. I just need to get to reading. I don't need to cut through layers of description about castles and forests and ladies' dresses. I don't need to worry about what other people think about it. It can cross cultures as is, and can be interpreted in a number of ways without having post-docs burning you in effigy.

It's the way most of them seem to want to develop, or what they see as 'beautiful code.' Pure and functional, all on its own. Reusable for other systems. Not bogged down in a history of infighting and red tape and better practices that really weren't.

We already translate literature into other languages. Maybe we need to start translating literature into perl. That, or start handing them copies of Waiting for Godot when departmental absurdism gets a bit high. At least they'd know they're not alone in their featureless landscape.

Tuesday, April 29, 2008

An IM with IB

Me:
We are going to the grocery store tonight, per your request
What do we need?

IB:
my request? as I recall, I asked you if you were planning on going to the store and you said yes
>.>

Me:
My request was generated by your need for a sandwich, and that need depleting our bread supply below standard levels.

IB:
my inquiry was clearly couched in such a way as to determine a pre-existing scheme to visit a food and goods selling establishment

Me:
Ah, but that inquiry was submitted at such a time where alternatives to the bread need were limited, and alternatives would have still led to a need to go to the store.

IB:
At said point in time, you had no way of knowing the reason behind my request, as no mention of bread supplies had been made

Me:
If I recall correctly, it was the usual time for dinner deployment, so any requests in that timeframe are considered 'dinner' ticket items, rather than general requests. Also, you submitted your request in the venue of the kitchen, further flagging it a 'dinner' item.
...
F* I've been documenting too much.

IB:
haha
I win

But what color is it?!

You have some really strange conversations when you're trying to document a project. For the sake of tech neutrality, I'll use an analogy: house building. We've built a house. Now I need to document what color the walls are.

"What color did you paint the walls?"

"They were yellow a few months ago."

"... But what color are they now? Are they still yellow?"

"Industry standard is eggshell, but of course, we didn't go with that."

"Okay... they're not eggshell. I don't think they're yellow. What color are the walls now?"

"I'm planning on painting them green in the future, but the current budget didn't allow for it this time around."

"What color are the walls now? At this time? At this juncture in the space-time continuum?"

"I just want to let you know... the customer insisted on the colors. We had nothing to do with it."

"Do you want me to guess? Are they blue?"

"Pfft. Everyone knows blue isn't scalable."

"Please. Just tell me. What color are the walls?"

"I used Benjamin Moore flat, applied with a 2" polyfiber roller with edging done by a 3/4" angled camel hair brush."

"There was not one color in that sentence."

"But it's important. Write it down."

"RRRrrrrr Fine. I'll go look myself."

Stompstompstomp

"Okay, they're pink."

"Technically, they're salmon."

Sunday, April 20, 2008

Life as a documentation fairy

Being a business analyst means that, when you show a slight aptitude for doing something, you often get to try out that aptitude on a task that's a few orders of magnitude above 'beginner.' Sometimes, I get to do exciting things, like code, or do graphics, or re-learn CSS. This go around, however, I'm doing documentation. I've learned quite a bit about documentation in the last few weeks.

Nobody likes documentation. Everyone knows they should be documenting a project as they go along. Everyone starts off with the best of intentions. Of course they'll document everything! There'll be a wiki, a discussion list, a note taker for every meeting, glamorous spreadsheets and presentations... And then the end draws near, and in the midst of the final crunch, it occurs to people that, maybe, just maybe, they'd left a few things out. Like an updated server architecture. A few tables of their database. What their application was actually supposed to do in the first place. Little things like that.

Enjoying documentation can make you odd. It's like being a coroner. No one, when they're five, says, "Hey! I want to cut open dead bodies and weigh stomach contents for a living!" Even me, in spite of what you may have heard. No one enters a career and thinks "Gosh, I want people to cringe when they see my email pop into their inbox." So you find odd reasons for liking what you do. Some coroners like the hours, or the strange things one can find lodged in someone's eye socket. I'm finding doing activity diagrams rather soothing.

Everyone thinks they document. It's kind of funny, when you start going around, asking for documentation. What people have is never, ever up to date. The best I've gotten was a graph that was a few months old. It's like being an oncologist in a town of herbal remedy folks. They think they're doing great until they realize they're losing vital bits.

Documentation specialist != therapist. Everyone just thinks it is. No project goes smoothly. None end with everyone thinking everyone else is just the keenest people ever. If you offered a group that had just finished a long, painful project a free vacation to an isolated beach house, I guarantee the only ones who would take it up have malice on the mind. You'd pick them up after a week, and there'd be suspicious bleach stains everywhere. I get this vibe from the venting I get every time I ask someone for something. I keep expecting it to leak into the diagrams I get handed.

Everyone swears they'll be better next time. Like the teens who forget the condoms, they swear, next time, they'll plan ahead. They'll keep track of everything, plan ahead, and not toss extra man hours at what is not a manpower issue. Really.

Next time.

;\

Saturday, April 12, 2008

elephantangelchild

I've always loved e. e. cummings. I'm not a huge fan of poetry, and for my post-modern fix, I usually go to film, but there was something about the way he could bring disparate elements together and make them make sense. Like Calder junk fish, where a bunch of nonsense objects are put together in a way that's whimsical, but a little sad. The whole is made of these tiny junk objects that have come together for a short time to make sense.

Plus, cummings has the habit of driving people crazy, and when isn't that fun to watch?

I'm a person of disparate elements. I draw. I program. I write. I design. I'm usually good enough to get recognition from my peers, but not an expert at any of it. Maybe I could be, but I've always worried in the back of my mind that I would lose a skill if I focused too much on another. It's why I chose to go into psychology. For all its reputation of leather-couch-toting, nodding hypothesizers, it actually is a field that requires you to be diverse in your abilities. Math for reading and conducting studies. Science for understanding chemical interactions and neuroscience. Writing for creating readable journal articles. I even fit in art on the margins of my notes.

These days, I'm what they call a business analyst. I've come to the conclusion that it's corporate speak for "We don't know what they do, but we bet they do it all." I've been where I am for over a year now, and I've done the following:
  • Programmed
  • Designed the UI for a site
  • Created graphics
  • Written copy
  • Translated a UI from an image into functional CSS
  • Written manuals
  • Been a sys admin
  • Held the occasional hand when a customer was at the end of their rope
I get the push now and then to hop a fence, and go fully to bed with a certain group. I'm happy balancing on fence posts, though. I may teeter more, but it's more fun being uncertain.