Ars Technica’s Jon Stokes on “IT Consumerization and the Future of Work”:
[T]he cheap, ubiquitous transistors provided by Moore’s Curves have completely changed the sites—and by this I mean the actual physical spaces—in which we develop our sensibilities and expectations about what technology can and should do, as well as how it should behave. The end result is that the office has gone from being the place where you spend time with cutting-edge technology, to a technological boneyard where you’re perpetually trapped about three years in the past. Meanwhile, the new tech Meccas are retail spaces like Best Buy and the Apple Store, where you go to run your fingers over the future, and maybe take a piece of it home with you. The end result is that consumers bring to the office the expectations that they’ve developed through their interaction with consumer hardware, and in most cases those expectations are frustrated by the reality of corporate IT.

Stokes isn’t dismissing corporate IT departments’ reasons for handling things as they do, so those of you who spend your days trying to keep your incorrigible users from downloading network-trashing viruses from AOL Instant Messenger can lean back from the let-me-tell-you comment you were probably about to post. He is, however, observing that as virtualization becomes easier and easier, and the attractions of “cloud computing” more substantial, more and more business users are going to demand that their IT departments allow them greater individual discretion over their work-related technology than they’ve enjoyed in the past. In an increasing number of fields, not just the tech sector, IT flexibility will be a significant recruiting factor. Certainly companies that enjoin their employees to master the Internet and develop forward-looking business models while simultaneously chaining those same users to locked-down computers and noisome net-nanny programs are going to find themselves falling behind in the struggle for talent and innovation.

This is obvious from the trenches; it will be interesting to see which companies, particularly which media companies, come to find it obvious from the executive suite.

This phenomenon is also at work on the network, where users develop their sense of how networked apps (messaging, collaboration, and archival) should look and function through daily contact with the lively ecosystem of consumer-driven Web 2.0 applications. Next to something like Facebook or Google Maps, most corporate intranets have an almost Soviet-like air of decrepit futility, like they’re someone’s lame attempt to imitate for a captive audience what’s available on the open market.
Capabilities :) Have your cake, and eat it too :)
I worked at home for most of five years without a complaint from our clients or anyone else.

Then they reorganized and I got a boss who was offended by the amount of money I made. She started questioning every single move I made and deliberately tried to throw me off that way. She ended up 'laying me off' because I had too much sense to go, "F-you, I QUIT."
And prospects are very thin. I'm starting to get anxious.
The company where I work has lots of web-based stuff, most of which needs to be redesigned because the user interface sucks. 'Unfriendly' is the mildest word I would use.
Not to mention, what kind of password-management/security system expires passwords every three months without warning the users, even on stuff that most of the users only need every six months?
(Our hardware isn't bad; most of the stuff in my group is fairly new.)
That should sound familiar to anyone old enough to remember when PCs and Macs started to infiltrate the workplace - resisted by IT, then triumphant, finally subverted and controlled. Of course that sounded familiar to anyone who saw minis sneaking into departments and labs, resisted by the mainframe acolytes in IT. Now it's coming around again. It's almost as though there were some iron law at work.
I highly recommend parking lot sledgehammer retirement ceremonies for outdated tech equipment as a morale-building technique. Remember to wear proper safety equipment....
The analyst firm Gartner has been talking about this since about 2005. They consider it "the most significant trend affecting IT." (IIRC, they coined the word "consumerization.")
The subject has come up in some of the interviews I've conducted with CIOs and other network professionals for some clients I write for. The general attitude seems to be appreciation for the potential to accelerate innovation (and one did mention the recruiting angle) mixed with a great deal of apprehension about security.
Security has a real bottom-line impact--one analyst firm found the average cost per compromised customer record to companies in 2007 was almost $200. One company lost $118 million due to compromised wireless networks in just two of their 2200 retail stores.
There's also the compliance angle. These days, security is as much about compliance with regulations like Sarbanes-Oxley as it is about keeping the bad guys out--maybe more so. And frankly, considering how the deck is stacked against consumers in their struggles over privacy with corporations, that's probably a good thing.
Earl @ 5
We rather like the idea of taking stuff up to the roof (50th floor, more or less) and dropping it off to see how well it bounces. On a weekend, so we won't actually have to worry about pedestrians.
One could certainly read the consumerisation of IT against the way that IT helps to extend the workplace into domestic spaces for an increasingly large percentage of the workforce.
I'm not sure if these two trends are in tension or mutually reinforcing.
This is essentially what I've been working on for the past, um, four years. I think we're making some interesting progress but we have yet to ship... ah yes, a familiar story. Of course one of the difficulties is hitting the moving target that is the state-of-the-art: one of the reasons that corporate web applications look so dated is that they're designed for the state-of-the-art at the start of the project, and by the time they're complete they look like yesterday's dog food.
I can't go into too much detail about our approach, but the general idea is hardly a secret: to apply some of the ideas and features of successful collaborative websites to corporate/enterprise applications. I know plenty of other places are working on it too, with varying degrees of success. One of the problems is that most enterprise software isn't much fun, so even when open-sourced the crowds that gather around fun stuff like Wikipedia don't necessarily show up.
Now, if you can help people get promoted, that's a different kind of fun. My boss has a line about how we're all in the entertainment business; it's just that the entertainment provided by buying enterprise software is that you get a promotion and a raise and a nice new buzzword on your resume.
One of the things I've always liked about working on software is the idea that you can do away with drudgery and leave more time for tasks that are at least partially creative or at least challenging. Replace data entry with OCR, or paperless systems; replace laborious manual verification with automatic checks; replace phone calls to indifferent customer support staff with a website you can search and find the answer in 30 seconds. Of course this doesn't always transpire. And there's plenty of potential for software merely to enable more efficient drudgery, to treat people like emotionless interchangeable components in a process and drive them to the point where they break.
And back to the point, sort of, there's a different kind of drudgery that comes by accident, when software is slow or faulty or requires workarounds or is just plain stupid. Nobody means to make that kind of software, but large organizations where people's roles and budgets are rigidly determined are particularly prone to the problem. Once every feature on the design spec is checked off, the project is declared done, and often the team responsible is broken up again and assigned elsewhere. So no improvements are ever made once it hits production, both because there's no budget or time for it and because the particular grouping of expertise that created it has been broken down.
My department makes IT twitchy.
The rest of the company (or what I've seen of it) is built on the premise: one person, one computer. Set that person and computer's permissions according to what they should be, and it doesn't matter exactly which part is "person" and which part is "computer."
I work in digital image processing. We have a staff of about 8 (I forget how many the night crew are), and 11 computers in 7 workstations. (11 PCs, that is; 3 other computers with a proprietary weird OS for image database manipulation only. We export from that to our "normal" computers.) With slightly-to-greatly different software on each, for different processes.
IT wants us to be logged in as ourselves when we're on a computer. Which seems reasonable, I suppose. Except that sometimes, we swap around several times in an hour--and it would mean every computer would need every person's profile, because that's how Windows works. Oh, and then we'd need logins for the occasional temps.
A login arrangement like social networking sites would be wonderful; they manage to allow individuals varying access to software and files, and logging in and out is simple... but our IT dept isn't going to rewrite LiveJournal code to allow various people access to Photoshop, or the archive drive, just so they can keep track of who-renamed-which-tifs.
The IT crew has my sympathy. Apparently, having six people logged in as "scanning-employee" gives the servers fits.
My experience of corporate IT, at a pretty low level, suggests a very different environment.
The company had many offices scattered across the country, with no local IT support.
So they had a bunch of computers with UK-layout keyboards, and every copy of Windows set to US-layout. And nobody had the access privileges to fix it.
The company did training, and other things. I was using their services on the "other things" side, but the fortnightly customer survey form was written for the training side.
Some of the people working there were OK as people, if not up to Making Light standards of intellectual dexterity. One or two had a distinct empathy disadvantage (and I suspect they were running a cheap labour scam with a certain local business), and one young man who came to work in a cheap supermarket suit, adorned with an array of facial piercings, whose brain seemed to have been rewired with a specification written in bureaucratese.
He apparently frequented the sort of bar where you get regularly interviewed by the Police, as a witness.
Of course, if you don't have the right papers you just don't know anything about computers. Even if you started out using a keypunch, and played Trek and Advent on a teletype.
Xeger, is capabilities a reference to capability based security?
Actually, I came into the thread to do some sort of snark: either a gentle ribbing that Patrick obviously had some work-related IT problems to get his dander up, or a note that I write this same blog post every two weeks or so on my work-related blog.
But then the first comment interested me. It's not a subject I expected to see at Making Light.
As for what Gartner talks about, I have never understood why anyone pays them for anything, considering the number of times I've seen them get it wrong by just parroting the conventional wisdom. Case in point: I remember they were a big pusher of SOAP-based web services, and everyone went for it (not just because Gartner said it was great, of course). I, of course, got a reputation at work as a half-demented guy for my opposition to SOAP and the attendant wreck of standards. Flash forward some years, and the guy in charge of the multi-billion-dollar web services project comes to me at lunch and, almost crying, says that Gartner was by and said that the web services 'wars' were over, and SOAP lost, REST won. I had to console him that things weren't that bad, when my natural inclination as a heartless bastard was to do a Snoopy dance.
oops, ran off topic.
Also:
"one young man who came to work in a cheap supermarket suit, adorned with an array of facial piercings, whose brain seemed to have been rewired with a specification written in bureaucratese.
He apparently frequented the sort of bar where you get regularly interviewed by the Police, as a witness."
Frankly these do not seem like the characteristics of a person whose brain would have been rewired with a specification written in bureaucratese.
Furthermore, none of the characteristics mentioned (other than the bureaucratese rewritten brain [I think rewritten works better in the context of bureaucratese]) seem to me ones that would keep a person from also possessing "Making Light standards of intellectual dexterity"
For years I did think that this would be the thing that drove me out of my current employment/career. It feels quite a bit better now, not because my working computing environment has improved (the hardware is fine apart from being PC, but the software is still awful) but because lots of web 2 stuff I use isn't blocked, and my iPhone gets round the rest (though I'd like it to be faster).
But I still get frustrated; despite being very careful, I often find it impossible to do my work without violating some aspect of our IT policy, I can't use a single smartphone for home and work, I still spend an inordinate amount of my time doing informal IT support for my colleagues, and we still have no access to IM or most online video, despite their obvious work-related benefits.
I've become part of IT only in the last 6 years or so; I've been working with computers for more than 30 years, and with other kinds of electronics before that, but IT was a shock to me. Oh, I've had to deal with the "IT department" before; the typical response to them from me and my peers was to ignore them, and change the root password on our computers, locking them out, if they gave us any hassle. Giving them control of our development and testing environments would have been corporate suicide in most cases, since they rarely understood the requirements of software development, and never understood what sorts of hardware and software we needed.
So I've gone from a creative environment where I was effective and useful, to a constrictive environment where I'm not very productive or effective, and where I'm constantly having to build and maintain systems that were put together using chewing gum, baling wire, and bird shit. And I'm constantly being told that there is only one right way to do something, the way that's based on the current buzzwords and management coloring books, and it's all new and improved, and no one could ever have thought of it before, because the current CIO saved us from the incompetence of the previous CIO (and is looking around nervously wondering when the same will happen to him).

Oh, no, not bitter much. Just sick and tired of being forced to use re-invented wheels and being told about some new "agile" methodology that somebody's making a killing writing a book about.
In my old (large, very corporate) company, the IT department had many of the problems described. The Intranet was...quaint...and the only search engine available was home brewed. No chat clients, no wikis. Heavily filtered internet if you had specific approval and a business case; otherwise, no internet at all.
The result was, of course, a black market in access and information. No one had more friends than the guy who could pop by and install software on your machine. The IP address of the experimental Google box spread like chickenpox in a preschool. Under the table wikis flourished. People used the net send command for chat. Every time they would put a lid on something, the functionality would wriggle somewhere else. Every time they would implement something, it was so crippled that no one could use it.
They're probably still festering there. I moved to a small company where we have open access and are expected to show some sense about what we download. Make a mess, and our skinhead death-metal fan* of a sysadmin will be very sarcastic with you as he cleans it up.
-----
* Seriously. He even has fangs, due to an accident of tooth formation.
Oh, man. The worst is when the IT guys know & want to do the right thing but are unable to do so because of stupid policies and no budget.
At my last job I had to spoof a MAC address and install my own router because their "system" would boot your MAC address off the "allowed" list if you didn't log on in the office every so often. I never figured out how long "every so often" was, but, since I telecommuted, I didn't make the quota.
When the IT guy saw my setup, he just said "I didn't see that!" and shook his head. I swear, IT-wise, it was like working in a particularly boring episode of Hogan's Heroes.
The other office "banned" all outside email. Hotmail, Gmail, Yahoo, you name it. Except ... the other office was in Europe, and everyone's phone could read email. So instead of people taking little breaks to check personal email during the day, they take LONGER to go outside and check it on their mobiles.
elfwreck #10: Much the same here. I work doing graphics for a contractor within an investment banking arm of a large bank. Our center is 24/7 and supports bankers worldwide (though there are subsidiary graphics centers in London, Chicago, and San Francisco), and each workstation is used by at least three people each week. We also have different software sets on our machines (ameliorated somewhat by the fact that we're just moving into a new building this week, with new computers that have been freshly set up).
IT has to keep our machines tightly controlled, unfortunately. Otherwise we might actually be able to install the software (and fonts) we actually need to do our work most efficiently. (Of course, with six or seven of us who share the higher-end machines, each with a different idea of what constitutes "the software we need to be most efficient", it might be for the best that business concerns require a lockdown. Except for the fonts, of which we're supposed to have a standard set -- except that IT managed not to get them installed properly at all machines in the old center.)
We'll see. I don't actually move to the new building (and thus the new machines) until next week. But it's for sure that something will go wrong, or be missing, and I'll be swearing at IT for taking forever to fix it and why can't I just do what needs to be done I can do it better than them goddammit -- oh. Yeah. Six of us, all saying that, about the same problem, with different fixes...
bryan@12 wondered ::: Xeger, is capabilities a reference to capability based security?
Yup - I'll post more extensively about capabilities for the rest of the audience (who are doubtless wondering WTF) when I've made it through the (insert appropriate set of negative adjectives) work day.
No idea why you'd be surprised to see capabilities based security mentioned here, though -- it's a wonderful melange of folk from so many places that I'd be more surprised to not find somebody with their tendrils involved in it.
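While we wait for that fuller write-up, here's a toy sketch of the core idea of capability-based security, using Python closures (all the names here are illustrative, not any real system): instead of checking an ambient identity against an access-control list, you hand each party an unforgeable reference that *is* the permission.

```python
class Document:
    """A mutable resource we want to guard."""
    def __init__(self, text):
        self.text = text

def make_capabilities(doc):
    """Mint separate read and write capabilities for one document.

    Each closure captures `doc` privately; holding the function is
    the only way to exercise that power -- there is no ACL lookup.
    """
    def read():
        return doc.text
    def write(new_text):
        doc.text = new_text
    return read, write

doc = Document("quarterly figures")
read_cap, write_cap = make_capabilities(doc)

# An untrusted component gets only the read capability. With no
# ambient authority to escalate, it simply cannot modify the document.
def untrusted_report(read):
    return f"report on: {read()}"

print(untrusted_report(read_cap))  # reading works
write_cap("revised figures")       # only holders of write_cap can do this
print(doc.text)
```

The attraction for the "have your cake and eat it too" crowd is that delegation becomes trivial and auditable: you can pass `read_cap` to a contractor's script without also granting it the run of the filesystem.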
Bruce @10, et al.,
I speak as an ex-IT security department member - not the best credentials in the world, since I now work as a security software vendor, but I was there when it happened. There is a reason why IT has to keep your machines tightly controlled. The reason is risk-coverage. If you remember back to the mid 1980s and early 1990s, you'll remember an environment where companies were being sued for having illegal copies of software on their corporate machines. It didn't matter if the company had authorized the software or not, the company was still considered legally liable for the copyright violation.
A lot of CIOs were told to make sure that their company's name was never part of a headline in the Wall Street Journal for this crime. The results are still felt today.
At some companies, a two-tier computing model is practiced, at least for things like Internet access. The bulk of the corporation gets filtered access; the VPs and above get the unfiltered Internet.
Friend of mine works for the U.S. government and says it is the absolute worst in her office. I understand they have some security concerns, but breaking your entire computer system is not the way to fix that.
Me, I just have to deal with being told that I'm a wimp for owning a Mac. I explained that it runs X Windows and got a "Well, but I bet you'll never figure out how the filesystem works!"
Macho engineers don't believe in user interfaces.
I CAN'T EVEN SWITCH MY BLOODY KEYBOARD FROM DUTCH TO US AT THE OFFICE I CURRENTLY WORK AT!!!!1!1! [1]
"Technological boneyard" indeed.
[1] Which means every time I have to input a single or double quote, it doesn't appear until I hit the next character after that, and if that's the wrong character, I get an á or a ç or even an ä instead.
In my experience corporate IT stuff is old and quirky for two reasons. First, often it was built in-house long before similar products were openly available. It's been surpassed in most respects by standard commercial products now, but because it has been carefully adapted to the business processes of the company, it's kept going. Second, like your car or your furniture, the hardware and software were bought a while back -- perhaps as much as a decade ago. Of course it isn't as spiffy as the stuff in the showrooms right now, but neither is what you have in your living-room. Is there some reason to expect your office to be any different?
Next to something like Facebook or Google Maps, most corporate intranets have an almost Soviet-like air of decrepit futility, like they’re someone’s lame attempt to imitate for a captive audience what’s available on the open market.
Of course. Apps in consumer-space have to live or die by what they can deliver to the consumer. Apps in corporate space can get by without delivering at all, as long as their salespeople are shiny enough to impress the person in charge of hiring. And once the company has been sold on the product once, there's a tremendous amount of internal inertia that keeps the product in place.
I'm one of the points of contact for a product my company uses, and let me tell you: their sales people will sell us *anything*. Absolutely anything. Regardless of whether the product can actually do it. Working with their engineering people is fine -- they're pleasant, upfront about their capabilities, and I enjoy working with them. Their sales staff? ARGH. (And double-argh when it's someone else they've promised something impossible to, because then it's my responsibility to find out why they haven't gotten it done yet.)
I know of one company -- not my own, thank Ghu -- whose employees spend their days researching. They're researching information in all different parts of the country, so they routinely have to check different county, state, and local databases and webpages to find information. Completely natural and required by the duties of the job.
And their IT staff makes their researchers *specifically request* any webpage they want to visit. Seriously. Any page. Want to check out morrowcounty.sc.us? Have to get specific permission. I can't imagine how much productivity that policy is losing them.
I've also met fellow road warriors who always carry two computers: one computer their IT department requires them to use, and one computer that they can actually use to check email or watch a DVD or check in for their flight.
To generalise from Lauren's point @19, a lot of ICT managers at all levels are averse to implementing anything at all, hard or soft, which is new enough to be potentially buggy. Let the other guy carry the risk is the mantra. So you have a load of companies eyeballing each other until somebody cracks first, by which time the product is obsolescent.
You, on the other hand, bought the single user version the first Saturday after release, and if it didn't work you swore a bit and spent some time in the forums till it did, or you ripped it off your machine. You risked $40. The manager who didn't buy it also didn't risk her job.
I’ve written both in-house and shrink-wrapped software, and the shrink-wrapped software is almost always much better than the in-house software. There’s a good business reason for this.
Let’s say that adding a particular new feature to Frobnix 2.0 will save each user one hour per year, but that the feature will take me 100 hours to implement. If Frobnix is in-house software with 20 users, then paying off this feature will take five years. But if Frobnix 2.0 is shrink-wrapped software with 100,000 users, then paying off this feature will take under half a day.
So for very pragmatic business reasons, shrink-wrapped software (and popular websites) will always get more love than your average in-house order-taking application.
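The arithmetic above is worth making concrete. A quick back-of-the-envelope sketch (using the hypothetical Frobnix numbers from the example):

```python
def payback_days(dev_hours, users, hours_saved_per_user_per_year):
    """Calendar days until cumulative user-hours saved equal the dev cost."""
    yearly_savings = users * hours_saved_per_user_per_year  # user-hours/year
    return dev_hours / yearly_savings * 365

# In-house Frobnix: 20 users, feature costs 100 hours to build.
print(payback_days(100, 20, 1))       # 1825 days, i.e. five years

# Shrink-wrapped Frobnix: 100,000 users, same 100-hour feature.
print(payback_days(100, 100_000, 1))  # about 0.37 days -- under half a day
```

The payback period scales inversely with the user base, which is the whole argument in one line: every hour of polish on mass-market software is amortized over thousands of times more users than the same hour spent on an in-house app.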
There are several ways to ameliorate the pain, however. First of all, you should never build an in-house version of anything that you can download from SourceForge, or that you can purchase on a departmental credit card. So there’s no excuse for not having wikis, decent bug trackers, or anything else like that.
But what if you could buy the software, except for the fact that it costs more than (say) your departmental manager’s $1000 discretionary limit for commercial software? In that case, the build-versus-buy distinction gets tricky. Let me recommend two interesting articles:
Camels and Rubber Duckies, by Joel Spolsky
Notice the gap? There's no software priced between $1000 and $75,000. I'll tell you why. The minute you charge more than $1000 you need to get serious corporate signoffs. You need a line item in their budget. You need purchasing managers and CEO approval and competitive bids and paperwork. So you need to send a salesperson out to the customer to do PowerPoint, with his airfare, golf course memberships, and $19.95 porn movies at the Ritz Carlton. And with all this, the cost of making one successful sale is going to average about $50,000. If you're sending salespeople out to customers and charging less than $75,000, you're losing money.
But once you start paying $75,000 for a software package, the quality drops tremendously. There are two reasons for this: (1) As before, the expensive software is amortized over a smaller number of users, and (2) the decision to purchase the software is made on the basis of PowerPoints (and maybe a golf game or two), not on the basis of quality.
How to Start a Startup, by Paul Graham
It's worth so much to sell stuff to big companies that the people selling them the crap they currently use spend a lot of time and money to do it. And while you can outhack Oracle with one frontal lobe tied behind your back, you can't outsell an Oracle salesman. So if you want to win through better technology, aim at smaller customers.
So if you ever have to make a build-versus-buy decision for in-house software costing more than a couple grand, you’re basically going to lose either way. If your programming team is good enough, you can get something nice and shiny—but still not as good as software developed for a larger user base.
Johan Larson @23, the trouble is that a car or a couch isn't a computer. Your couch's functionality doesn't depend on anything but itself. That's not true of a computer. People outside the company have expectations about what your computer will be able to do -- run certain pieces of software, open a PDF without crashing -- and if you can't do it, it impedes business. Furthermore, there have been tremendous innovations in the last ten years that significantly improve productivity. Why hamstring yourself and your company?
This line of thinking -- well, my computer may not be new and fancy, but neither is my car -- is what led the father of a friend of mine to try to work with the same computer for 15 years. He couldn't back up the articles he was writing because the backup media could no longer be purchased. He couldn't open articles that other people were sending him, because his computer had no idea what this stuff was. When the machine finally gave up, he was pretty well out of luck.
It's not just about something looking new and shiny. It's about actual functionality.
I seem to recall Borland Delphi being touted as a solution: a tool to quickly build a modern user interface which could talk to an existing back-end.
When did DOS-style line-drawing characters become part of the Windows environment?
I wouldn't be surprised if some software out there still needs to run on a FAT32 filesystem.
Ooh, ooh! I've heard of capabilities! (Sigh. I'm starting to feel the effects of having been out of the software game for five years.)
Lauren #19 (and chris y #25): I understand perfectly well why IT keeps its grip so tight. I don't have to like it.
Of course, their tight grip saved my job (though not without a false alarm).
We'd recently had a companywide* talking-to about bypassing proxies to get email, use of USB-key-based software (e.g., Firefox) on machines where stuff couldn't be installed, and not logging into our web-based accounts (and particularly for stuff that verged on NSFW, such as dating sites). I'd finished a (graveyard) shift and gone home at 7 AM. Apparently, the next person who logged into my station hadn't gotten the news (or, more likely, disregarded it): he logged into either MySpace or a dating site (the details are unclear) using Firefox and somehow downloaded some malware. (Three very quick strikes!)
IT, being on high alert at this point (immediately after a directive to tighten down), noticed this and called the center. But their software indicated that I was the "owner" of the station, having logged out of it last, and they tried to tell my boss (who leaves after me) that I was doing these things. Fortunately, he knew I'd left and was able to confirm with IT who was actually responsible, and that person took the long walk out with Security not too long after.
I heard the story when I came in that evening, utterly unaware that I'd missed being fired by a hair -- and having learned that I have a really on-the-ball boss, and that IT, when they want to, CAN do their jobs right. (Motivating them is a real issue for a cost center in a large profit-oriented corporation.)
----------
* We're technically not part of the company, being contractors. But the rules apply to us, too.
My (large, state government) office switched this year from a mainframe payroll program to an "off the shelf," one-size-fits-all, multifunctional payroll program that tied ALL state employee pay/leave information into one neat package.
We were told said program would be more efficient (no more printing of paystubs! All your pay and leave info online!), easier to keep track of your time (it's online and updated daily!) and more accurate too.
It's been none of the above. Yes, Payroll no longer has to print these handy little paystubs that are 3.5"x7.5" in size; instead, everyone has to print their pay/leave info on an 8.5"x11" sheet. Easy to access? Uhhh, no; it's been 7 months and we STILL cannot see how much leave time we've accrued online. Accurate? Who knows? People have had checks go missing, the wrong amount paid out, and leave time shown is subject to a variety of interpretations.
As far as our IT group goes, it seems they function only as a "this is the Way Things Will Be Done" organization, resistant to any suggestions from us lowly users on how things could be improved. It took an Act of God, it seems, even to allow us to put Google Earth on our PCs!
There's a fundamental tension between 'run servers' and 'desktop support organization'; trying to grow a windows desktop support organization into a full-up IS corporate back end for many people (thousands) is more or less utterly futile. Totally different skills, objectives, etc.
Where I work has a certain amount of IT policy, but it's also the sort of engineering company where IT is hopelessly outgunned in terms of technical ept, so I don't find it objectionable.
IT project implementation best practices, well, I should probably not talk about that without going through both Legal and PR. :)
Three years behind? Bwahahahah! (laughs hollowly).
Abi #15:
I knew a girl in college who had fangs; also black hair, pale skin and a profile like a neo-classical marble statue. She could have broken the heart of every goth in Toronto, but she couldn’t be bothered.
John L @ 32
It would take something like an act of Ghu to put Google Earth on the computers where I work, since corporate higher-ups don't want to pay $400/yr for each machine it's on (yes, that much). In fact, they made everyone remove it and blocked the download. Fortunately, Google Maps takes care of most of what we need it for.
Notice the gap? There's no software priced between $1000 and $75,000.
One counterexample is Autodesk 3ds Max, which costs about $3500. It has a pretty specific audience, though.
I used to work for IT in a University environment, which was great fun throughout the 1990s. Lots of internally-developed stuff to develop and play with. But we gradually got more off-the-shelf and corporatized, and less fun. So I moved into user experience/interface design, which is much more fun and flexible. (PJ Evans@3 - if your company ever WANTS to redesign the interfaces for its web tools, I now work for an agency that does that kind of thing.)
Joel is right (about the gap in software between $1,000 and $75,000) although I might set the bar at $5,000 and not $1,000. It doesn't only come from constraints on the purchasing end, though, but also from the staffing requirements at the vendor.
Here's why: something can either be done with the product out of the box or not. If it can, vendors are often in a position to say "Fine, we'll take your $5,000; now have a nice day and if you have any questions feel free to take it to our support forums". If it can't, it needs someone to make changes to the product and for non-trivial changes that means scheduling extremely expensive programmers, testers, interface people, tech writers, project managers, etc etc and it's hard to start even the smallest project of that type for less than tens of thousands of dollars in direct salary costs. Count in context-switching costs, lost opportunity costs, and idle time waiting for specs for instance, and $75k starts to seem like the lowest amount you can charge where you don't wind up actually losing money, as I say even for the smallest of projects. Now, that kind of custom application at a large corporation is going to be worth millions of dollars (and up; way, way up) to them, so whether it's $75k or $275k may not make much difference as long as it gets done.
On the other thread: the crazy lockdown offices are failing to make an accurate balance between the value of security provided (if any) and the costs of lack of access to information. The boss of the constrained workers ought to be kicking the ass of the IT boss in meetings with their boss. That they aren't may be a consequence of the way that the money-and-control-vortex that is IT attracts powermongers, who then use it for a base of operations in intracorporate warfare. This can happen with any department that decides that outsiders are the enemy, but IT is uniquely positioned to make everyone else's life more difficult.
I agree with Jacob (#38). It should probably be $5000.
The currently-free developer environment for Mac OS X, in a prior incarnation, cost $4995 per user when it was the NeXT Inc. developer environment.
"Enterprise"-level development tools often cost more than $1000. Delphi 2007 costs $1999 for the enterprise version.
I'm reminded of a former cow-orker of mine who switched jobs and ended up in Software Sales.
His beat was: Western Europe. (He was the only salesman in the sector.)
His quota was: two sales per year. (He usually made four, so he was in bonus money and happy.)
If you bought the software, they threw in the free VAX to run it on.
Because there isn't much call for fab line QA management software, they had a captive market at $250,000 for an entry-level license ...
Another reason your work software is old and quirky: the company won't upgrade till the latest version is proven not to be a piece of shit. Vista is a very good example of something that shouldn't be adopted early.
3DS Max is an interesting example. It's part of the world of CGI graphics, and one of a very few high-end programs, pretty well able to do everything. But the file format is pretty closely controlled, so you need 3DS Max to use the models, or one of the high-end programs with a licensed import function.
Luckily, you can download a time-limited demo of the latest version of the program, and that will convert the models without hassle. About four years ago, I did that with a collection of free models I had accumulated.
But there are also extremely generous student discounts. So any company in the CGI business is buying a very capable tool with a large pool of users able to use it. And any company in the business is going to be buying more than one copy.
This isn't about staying within discretionary budgets for an office: it's a major production tool, that a company might need a dozen copies of.
Meanwhile, without spending anything like that much on software, I find people publish my CGI work--fanzine level, perhaps, but somebody made the choice.
I just linked to this thread from my own little consulting company's blog. (This is me hoping that semi-pluggy links are cool, so long as they're circular...)
This thread resonates strongly with me. I went independent in part to escape from all the phenomena mentioned here, and I'm now carefully pushing my company into a phase where it's partnering with other independent consultants while remaining free of shared physical office space. It's possible specifically because of advances in communication technology and even methodology, all of which are available on the consumer market, and half of which are free (or nearly-free) services.
I doubt that this is a work-style appropriate to any kind of business, but I'm fascinated at how it's lately becoming more and more feasible. I embrace it cautiously but hopefully.
Here's my perspective from a more macro level, having spent 7 years as chief architect for a business division of a Fortune 20 company with a lot of experience, some good, mostly bad, in these kinds of battles. Inside of every large corporation there are two kinds of organizations: staff (HR, IT, etc) and line (sales, product, manufacturing, shipping, etc). Both of these are vital components of running a large enterprise, but they have fundamentally different success criteria.
Line organizations have a pretty clear value proposition: how much do you add to the bottom line. Almost everything a line organization does is focused on making money, therefore the people in line organizations are focused on making decisions, executing plans and producing results.
Staff organizations understand that they don't directly add to the bottom line. The only way they can justify their existence is by promising to reduce costs and/or risks being taken on by the line organizations. Of course in order to accomplish this, they must be empowered to inject themselves into the decision making processes and operational workflows of those line businesses. People in staff organizations are concerned with oversight, definition and adherence to rules, policies and procedures.
You can see that in an ideal situation, there would be a healthy tension between the "rules/process" parties and the "execution" parties and this balance would help drive the scalability and longevity of a large enterprise. (Let me know if you ever see one in the wild) In an out of control situation, one side dominates the culture resulting in either "The Dept of Motor Vehicles: A Process Success Story" or "Enron: The Smartest Guys in the Room" scenarios.
Technology adoption has been and always will be a frontline in this battle whether it's typewriters, phones, faxes, computers, PDA's, internet, etc. Nothing about this generation's latest and greatest tech has changed that or will change it. If you are going to play the game, then you have to acknowledge these motivations in developing your corporate kung-fu (when you can snatch the Red Swingline from my hand, then you will be ready).
I need to do another entire post about how this article missed it by that much. The story of this decade isn't about the effect of digital age technology on institutions, but how digital age tech actually enables organizing people without organizations.
Sarah, 34
abi, 15
I knew a girl in college who had fangs; also black hair, pale skin and a profile like a neo-classical marble statue. She could have broken the heart of every goth in Toronto, but she couldn’t be bothered.
Oh, how cute!
Speaking of fangs, I had a moment of internet coincidence - I was just reading the entry on Cute Little Fangs on tvtropes, and one of the tropers commented:
This trope is also Truth In Television, most Asian people get what's called tiger teeth, a genetic trait that results in elongated eye teeth. This troper can definitely vouch, since this troper has two sets of tiger teeth (one on the top row of teeth, one on the bottom) Especially the rest of this troper's family as well.
All my mighty powers wrt teh Goog have failed me: can anyone here substantiate this assertion? (And/or direct me to the medical term for elongated eyeteeth.)
Imagine for a moment working at a large corporation, on a billion-dollar hardware project targeting a Windows Vista-only release.
Imagine that same company's IT department announcing they are "skipping Vista" and waiting for Windows 7.
Imagine that none of the Vista development machines is allowed to connect to the internal network.
Imagine the further joy when they told us our server had been "accidentally decommissioned in a non-recoverable manner".
And that the backups had failed.
And they wonder why we work behind their backs.
Paul --
Ki-yi-yi.
And here I was really mad when someone from IT formatted the partition with the production database on it. (Their backup didn't work; mine did. They were apologetic, and it was an actual mistake and all, but that was such a complete Monday.)
My division is/was in an interesting position.
The company that acquired us in 2005 was largely a hardware firm. Cable industry widgets. Very little software development went on there. Their IT didn't have to deal with the peculiar needs and preferences of software developers.
Like: Having desk workstations that didn't run Windows. Or that demanded that they be rebooted when Windows was updated. Or that ran more than a meagre set of applications.
The programmers I work with run cross-compilers, to create code that will run on a peculiar custom OS. I and the other QA folks need to have machines with set IP addresses that need to run for a week at a time, to handle test scripts. We all need archiving software to check code into and out of.
IT has learned to adapt. The guys and grrls I work with now tend to run PCs with Linux installed, to keep from being pestered by the demands of Windows' security flaws. I still have an old Sun workstation, but as soon as there's a budget I'm going the same route.
Oh. We were acquired again at the beginning of the year. The new crew seems a lot more clueful. But there's the whole rig-a-ma-role of getting used to a new set of dumbass administrative suites that handle evaluations and workplace surveys and the like.
One of the reasons for the tension between line organizations and staff organizations is the regulatory environment. (This is my area of expertise at the moment.) Sarbanes-Oxley, Payment Card Industry Data Security Standards, HIPAA, California SB1386, the various ISO standards used across the EU, and the CCITT all require various security standards, with various penalties for lapses. Sarbanes-Oxley can get the CEO jailed. PCIDSS can get your access to VISA or Mastercard terminated. SB1386 can cost your company a lot of money in bad press.
Most of these regulations require strict control of the desktop environment, to prevent unauthorized access to data, to prevent unauthorized distribution of data, to ensure authentication, authorization, auditability and confidentiality. If as a corporation, we permit applications to be installed without an appropriate review process, we set ourselves up for trouble.
That's me speaking as a security person. Speaking as a vendor of software, (just to mess up some people's theories, our software is usually between $15,000 and $100,000 USD) we have to work with this security. We depend on certain DLLs, APIs, naming conventions and other Microsoft supplied or mandated resources. If you have a non-standard environment, our software, which your corporation is using for specific business reasons, won't run. We have 1/5 of our company devoted to helping users customize the software in these situations.
Picture me saying in a pleading voice "Please don't mess with your corporate-issued computer, that belongs to the corporation, that the corporation expects you to use with the software the company I work for has sold to the corporation. It makes my day a lot worse."
Just for the amusement of the folks here:
We're building a GIS database for mumble-thousand miles of pipe (more than 5000). One of the software packages we swear at uses a server which died last Thursday. It's still down, and we don't have an estimated time to get it back. This is, as you can imagine, thrilling all the people who normally use it.
We're also scheduled to have that database down, toward the end of the month, for conversion to the new program we'll be moving to, instead of old cranky. (Another thrilling moment for everyone in the group.) Fortunately my work mostly doesn't use that database. Yet.
Just to provide a little balance to all of end-user horror stories. Imagine that you are a technology director for a line organization inside a large healthcare company (see Lauren's comments on Sarb-Ox, HIPAA, etc). Imagine getting a heads-up call from an IT director:
"Hey...heh heh heh...you're gonna love this...you wanna take me off speaker?" Imagine you start popping Pepcid and Aleve pro-actively as you close your office door.
"So, Smith (VP) at your XYZ office just opened a critical production ticket for their online sales site" Imagine that you recently inherited this business unit as part of a re-org and vaguely remember hearing about an online system from the high level briefings.
"Well, we can't find any of the servers: no web, app, db, nothing." Imagine saying something like "You _lost_ an entire ecommerce system? That's gotta be a record for you guys"
"Heh heh heh...Not this time smart guy. We finally did a dns lookup on the url and tracked the ip address to [redacting name of Really Horrible Hosting Service]." Imagine staring at the phone as the realization slowly dawns on you "They totally went guerrilla, it blew up on them and now..."
"Yeah, they figured they could just open a ticket and have us clean it up. I'm gonna close this out and file it as a User Training issue. I can give you 48 hours before I need to kick this upstairs. You owe me..."
Eric@26: I see a couple of problems with your first calculations. The shrink-wrapped software won't see any benefit from making the improvements (because people will hold their noses and buy it) or any pain from lack of improvements (because the peasants-with-torches never make it to Redmond et al), and many of the improvements will only affect a small fraction of users, making the payout time longer. Add to this that the shrink-wrapped product has a long cycle where in-house can be turned around quickly \and/ can be tested by the complainers rather than somebody looking at a complaint 4th-hand, and the advantages can get twitchy.
Not that in-house is free of problems; one person with a "vision" has fewer people to throw cold water on him, which can make the result anything from insanely great to just insane. And larger houses see inertia and lack of feedback even for in-house work; I remember one person watching the first demo of a new CASE system and saying repeatedly that he didn't work that way. (It didn't help that the author of several shell scripts didn't know some of the basics of csh....)
And count me among those unimpressed by Spolsky's numbers; both of the companies I've written software for have/had several products selling for the range he brackets (including some in the $5K-75K range). If he's talking about more generally-used software than I was working on, he \may/ have a point concerning total-package price (e.g., the salesman offers a many-seat "total solution" against off-the-shelfware that sells for <$1000), but that's apples vs. oranges.
continuing with Jacob@38: one answer to that is to have generalized solutions that assume some amount of adaptation for every site; the adapters don't alter the deep implementation, only the installation, so they can be immediately responsive without the overhead of making sure that no part of the product is affected by a change. That can also reduce the conflict between the new whizbang and the established workflow (as above).
And while I have relatively little sympathy for many of the crotchets of the IT departments I've known, I've also worked sysadmin in a small house where the developers had \too/ \much/ access; they shuffled the network protocols after hours so they could run multiuser dogfighting on their ultra-graphical systems (SGI, 20 years ago), and tended not to reset, breaking the small-hours automated backups I'd put in.
I recently moved (after 9 years) from the IT side of the house at an academic research institute to "line" systems engineering in the corporate world, so I've been on both sides of this line within the past six months. It's been an interesting transition, though there are a lot of factors besides the IT/line distinction involved.
Interestingly, Lauren's #49 applies to both environments, in different ways....
CHip @52:
You clearly don't have elongated eyeteeth, or they would have remembered to reset those protocols.
"Joel is right (about the gap in software between $1,000 and $75,000) although I might set the bar at $5,000 and not $1,000. It doesn't only come from constraints on the purchasing end, though, but also from the staffing requirements at the vendor."
Well, at one job I suggested we replace the search engine we were using with a Google Mini (http://www.googlestore.com/appliance/product.asp?catid=3). At the time, IIRC, it cost $2,500; now it costs $2,990. It doesn't matter, because unless the price was $20,000+, we spent more than the price just holding meetings discussing the purchase, once you add in the cost of me having to do the analysis, the cost of other people's time, etc.
Some things to keep in mind is, everyone hated the current search engine and agreed it had to be replaced, the scenario I described was that considering the low cost and the general reports of its success it made sense to buy and test it rather than to spend a lot of time discussing what we should buy (this was before I discovered OmniFind, nowadays I would just set up my own OmniFind search and demo it, and then waste the same amount of money holding meetings etc. but maybe succeed at the end because I would have something to show[although actually I've worked in enough places where that doesn't work either] )
The end result of the thousands of dollars in reports and meetings was that when they paid hundreds of thousands for their new Plone based site they used the unsatisfactory Plone search engine (whichever that is) with a consequence that more than a year after launch of the new site I can never find anything on it.
Hmm, I got off track there. I guess what I'm trying to say is: no, $1,000 seems about right to me.
"accidentally decommissioned in a non-recoverable manner" (Paul Lalonde @46)
I like it; I do like it. I think there would be a few places it could be used in relation to non-computer circumstances as well :)
Lauren @49: oh yes, I used to work in computer access for a Very Large Pharmaceutical research company (on the site where they developed those little blue pills for frustrated gentlemen).
It's a lot easier to get users to conform to the standard desktop after you explain that if the environment isn't regulation, or if they log in as each other, then the FDA can refuse to license a drug that may have been ten years and several million pounds in development.
Just don't ask me about that time I deleted the entire research data... (it wasn't my fault, the data was multiply backed up, and the guys in ops needed the overtime anyway...)
"accidentally decommissioned in a non-recoverable manner"
Isn't that generally what a) cars do to wildlife and b) GWB has done to the constitution?
Wait, sorry, scrub "accidentally" for the last one.
Pete @ 57, I worked for a bank that has since been completely consumed by a Very Large Bank, thus necessitating my transfer to software developing.
Our standard joke was that the Feds had three different audit outcomes - "pass", "we're warning you once", and "no more FedReserve access for you!" which completely destroys a bank's business. (You have to say it in the SoupNazi voice.)
VISA only has two audit states - "pass" and "you have shown us by your actions that you really don't want/need/deserve access to the VISA network. Goodbye!" (This one uses the Weakest Link voice.) Sometimes I think VISA is a tougher cookie.
When I was in college, I saw an archive recording, made sometime in the late 1960s or early 1970s, of a lecture by Adm. Grace Murray Hopper about the costs of data. She talked about the well-known costs of adding, deleting or modifying data in the database. She also talked about two costs that hadn't been considered before that time: the cost of doing without your data processing environment and the cost of being perceived as an organization that did not protect data that was part of the business of the corporation. She was a fabulous speaker and I think I can date my interest in Information Security to that very lecture.
Diving in quick without reading all the recent posts, because I have to journey behind the net nanny real soon ...
"Capability-based systems" are a really old idea, and a very good one, IMNSHO. I have a copy of "Capability-based Computer Systems" by Henry Levy, published in 1984 in front of me now, and somewhere in the singularity that is my technical library is a copy of Elliot Organick's book on the Intel 432 project, which was a capability architecture (I worked with a lot of engineers who were survivors of that project), but that's another, two-beer story. I used to have a complete set of hardware and software manuals for the i432, but finally decided it was probably the better part of intellectual property law to recycle them. A shame, that design was so far ahead of its time that it would be considered leading-edge in some respects even today.
IMO, the current Linux team, especially including Linus himself, has exactly the wrong end of the stick in insisting that capability-based microkernels are a bad idea for OS design, but that, too, is another story.
Anyhow, the basic definition of a capability is a token containing a unique identifier for some object in a computer system, and a set of access rights to that object. So any other object holding a capability has the access rights given by the capability to its target object. In a secure environment, each capability is signed by the target object or its representative, using an unforgeable signature (some token encrypted by the object's private cryptokey, for instance). The target object, and any other object that cares, can verify the signature and therefore prove that the capability is valid. Obviously there are a lot of other concerns, such as verifying that the capability was in fact given legally to the user, which are left as an exercise for the security officer. One major difference between capabilities and Access Control Lists is that each user has a separate capability, containing only its permissions, whereas an ACL is a central resource containing the rights of all the legal users of the target. This has some really fundamental effects on how security scales and functions across breaches of security and changes in access permissions in distributed systems which I will leave you to think about while I go to work.
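A minimal sketch of the crypto-signed capability described above, assuming HMAC-SHA256 as the "unforgeable signature" and a secret key held only by the target object; all names here are hypothetical, and a real system would also handle revocation and delegation:

```python
# A capability is a token: (object id, set of access rights, signature).
# The target object signs the token with its private key; anyone can
# present the token, and the target can verify it was not forged.
import hmac, hashlib

SECRET = b"target-object-private-key"  # known only to the target object

def mint_capability(object_id: str, rights: frozenset) -> tuple:
    payload = (object_id + "|" + ",".join(sorted(rights))).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return (object_id, rights, sig)

def verify_capability(cap: tuple) -> bool:
    object_id, rights, sig = cap
    payload = (object_id + "|" + ",".join(sorted(rights))).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

cap = mint_capability("payroll-db", frozenset({"read"}))
assert verify_capability(cap)                       # genuine token passes
forged = (cap[0], frozenset({"read", "write"}), cap[2])
assert not verify_capability(forged)                # tampered rights fail
```

Note the contrast with an ACL: here each holder carries only its own rights in the token itself, rather than the target consulting one central list of every authorized user.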
yeah, basically I think ACL based security is a reasonable way to do security on any network where you know every user and your main security concern is not really users doing diabolical things to damage computers (although this is of course a concern) but that users not access documents that they shouldn't, for example the list of people to fire at the end of the month.
for a distributed system where almost all of your users are anonymous? Not so reasonable.
It seems trivial, but one of the many reasons I gave up on teaching public school was the way the IT department locked down our computers. I would work on lesson plans at home, only to come in the next day and realize that 1/2 of the sites I wanted to use were blocked off, or used Java, which of course, our computers couldn't use. And blocking all web-based email -- what the hell is that about? Especially in a job where you are *very* hard to reach from 7:30 to 3:30.
Being treated as an untrustworthy child in that way was just a concrete example of all of the other ways that teachers are not taken seriously as professionals by their own employers.
But the kids...Well, I still regret not sticking it out.
abi@54: damn, I should have thought of that -- Jack's Joke Shop was still around then, so I could have worn a false set to make up for the deficit of nature.
While you're being brilliant, what would you have suggested for the engineer-manager who left his ashes everywhere including my tape drives?
This thread gave me a shove to finishing off my latest rant on software and putting it up on my blog, so I thought it would be appropriate to let the readers here know about "Of Languages and Wheels".
#60: Two quibbles from my perspective:
the basic definition of a capability is a token containing a unique identifier for some object in a computer system, and a set of access rights to that object.
Us object-capability folks would say that the second part is not necessary; a capability system can equally well function by considering each distinct set of rights a distinct object (a facet).
(However, the target-object-plus-access-rights model has some advantages for some purposes (e.g. not needing to allocate facets) and can be looked at as building a standardized efficient facet type into the capability system.)
In a secure environment, each capability is signed by the target object or its representative, using an unforgeable signature (some token encrypted by the object's private cryptokey, for instance).
This is a crypto-cap system. There are also descriptor-cap systems, where the capabilities are opaque (not represented visibly as bits), and provided by an operating system/interpreter/virtual machine (much like Unix file descriptors; the small integer denotes "this process's capability in slot N").
This is how object-capability systems and most capability OSes work; the basic advantages are that there is no crypto involved and that capabilities can be distinguished from data. The latter property makes it easier (or possible at all, depending on the details of the system) to implement membranes, confinement, and replayable deterministic computation.
Descriptor capabilities only work within a single OS/VM (or set of mutually-reliant ones), though; to communicate across an unreliable network you need to use crypto-caps. (It is possible to map descriptor-caps to crypto-caps at the boundaries of the system; E is an example of this.)
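A toy illustration of the descriptor-cap idea, assuming a single trusted kernel/VM holding the table (the class and method names are made up for this sketch): the process only ever sees a small integer, much like a Unix file descriptor, so the capability is opaque and cannot be forged or mistaken for ordinary data.

```python
# The kernel-private table maps descriptor slots to (object, rights).
# Processes hold only the slot number; the real binding never crosses
# the kernel boundary.
class CapTable:
    def __init__(self):
        self._slots = []  # kernel-private: slot -> (object, rights)

    def grant(self, obj, rights):
        """Give a process a new descriptor for obj with the given rights."""
        self._slots.append((obj, frozenset(rights)))
        return len(self._slots) - 1   # the opaque small integer

    def invoke(self, slot, right):
        """Exercise one right via a descriptor; refuse anything ungranted."""
        obj, rights = self._slots[slot]
        if right not in rights:
            raise PermissionError(right)
        return obj  # in a real system: perform the operation on obj

table = CapTable()
fd = table.grant("build-log", {"read"})
assert table.invoke(fd, "read") == "build-log"   # granted right works
try:
    table.invoke(fd, "write")                     # ungranted right refused
except PermissionError:
    pass
```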
Separately:
In my opinion, what capabilities are most useful for is not to replace what we're doing with ACLs (or Unix permissions or same-origin policies (gack)) today (though that should happen as well), but to protect users from the programs they run. (Which was mostly not a concern in the early days of computing, which is how we got in the mess we're in.)
I want to see an end to viruses, trojans, "malware" of all sorts — at least in the forms that can do damage to your whole computer and spread themselves (as opposed to just misbehaving at the task which you set them).
I want to see an end to having to trust every programmer you run a program from with your whole computer.
(I could say more, but this is long enough already...)
Kevin Reid @ 66
Thanks for your elaborations of the skimpy definitions I wrote.
Descriptor capabilities only work within a single OS/VM (or set of mutually-reliant ones), though; to communicate across an unreliable network you need to use crypto-caps. (It is possible to map descriptor-caps to crypto-caps at the boundaries of the system; E is an example of this.)
Yes, it's possible, but it can be problematic when the boundary between systems is unclear. For instance, where's the boundary in a distributed operating system? Not saying it can't be done, just that it has to be done very carefully.
The guy who taught me a lot of what I know about operating systems used to say that he was on a crusade to remove the very concept of "superuser" from the vocabulary of working engineers. Why, he used to ask, should there be anyone who has no limits on access or control in a computer system, when, as designers, we can control that access at any granularity we choose? It's as if you gave the air-conditioning maintenance person the keys to the panopticon, because he needs to get into all the ducts. Of course if you put it that way, you have to wonder why we need a panopticon.
Additional point about distributed systems and security: sometimes dealing with separation in time is even harder than dealing with separation in space. It's in principle impossible to build a global clock for a distributed system, so there are problems of synchronization, and it's impossible to have infinite memory for all states of all parts of the system, so you have to know what to remember, and what to do when someone references something you don't remember. Given unreliable communications, you have to have policies about what to do, for instance, if you get a message that's out of sequence and older than your records of the original conversation stream. You can't just accept it as genuine, because you have to be suspicious of man-in-the-middle attacks.
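One common form of that policy can be sketched very simply, assuming each peer stamps its messages with a monotonically increasing sequence number (the function and variable names here are illustrative, not from any particular protocol): a message at or below the highest number already seen from that peer is treated as a possible replay and refused.

```python
# Per-peer replay suppression: remember the highest sequence number
# accepted from each peer, and refuse anything at or below it.
highest_seen = {}   # peer -> highest sequence number accepted so far

def accept(peer: str, seq: int) -> bool:
    if seq <= highest_seen.get(peer, -1):
        return False            # stale or replayed: be suspicious
    highest_seen[peer] = seq
    return True

assert accept("alice", 1)
assert accept("alice", 2)
assert not accept("alice", 1)   # older than our records: refused
```

This trades completeness for bounded memory, exactly the constraint above: you remember one number per peer instead of every message, at the cost of also dropping legitimately delayed out-of-order traffic.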
One of the many reasons I work where I do is the IT policy:
"Here's a machine. It has Debian on it. Put whatever you want on it provided it's Free Software. Here's the configuration information. If you blow up your machine, you fix it. If you can't, we'll reimage corporate Debian. If you take down our network or otherwise cause damage, we disconnect your network connection at the switch, and when you've proved you've fixed the problem we'll let you back on. Have fun!"
Now, in order to do my job, one basically has to have been a sysadmin in a former life; obviously corporate IT policy for sales staff is somewhat different.
#7: Not everybody has an etherhub recovery dive logged. I do. No, that's not me, unfortunately.
#62: The problem is that for every 5 teachers like you that can tell their computer equivalent of arse from elbow, there's one "look at this cool toy" teacher that will cause more damage, more times, than what they'll save by the 5's higher efficiency. Plus, of course, schools get nailed hard if they run afoul of the Piracy Cops, and that one teacher in the school who doesn't understand every word of the 14-page Ethics Understanding and Law Authorization means that 30 computers don't get into the schools next year, or (more likely) the admin to support them has to be laid off for "budgetary restraint".
I grew up around teachers. They're brilliant, hard-working and dedicated. And sometimes a little too focussed.
And then there's the approach here: find a colleague who can fix your problem, and return the help in kind when they need it.