
Moved

Moved. See https://slott56.github.io. All new content goes to the new site. This is a legacy site and will likely be dropped five years after the last post, in January 2023.

Showing posts with label innovation. Show all posts

Tuesday, November 24, 2015

Coding Camp vs. Computer Science


Step 1, read this: "Dear GeekWire: A coding bootcamp is not a replacement for a computer science degree".   It's short, it won't hurt.

I got this comment.

"The world runs in legacy code and the cs degrees focus on leading edge 
Most of what is learned in cs [is] never used in the mainstream of business 
Much of computer work is repetitive and uninviting to upwardly mobile people who generally are moving up not improving the breed"

I disagree.  A lot.

"The world runs in legacy code." First, this is reductionist: everything that's been pushed to GitHub is now a "legacy". 
  • Does "legacy" mean "old, bad code?" If so, only CS grads will be equipped to make that judgement. 
  • Does "legacy" mean "COBOL?" If so, only CS grads will be able to articulate the problems with COBOL and make a rational plan to replace it with Microservices. 
  • Does "legacy" mean "not very interesting?" We'll return to this.
"CS degrees focus on leading edge." Not really true at all. The foundations of CS (data structures and algorithms, logic, and computability) haven't changed much since the days of Alan Turing and John von Neumann. They're highly relevant and form the core of a sensible curriculum.  

The "leading edge" would be some Java 1.8 nonsense or some AngularJS hokum. The kind of thing that comes and goes. The point of CS education is to make languages and language features just another thing, not something special and unique. A little CS background allows a programmer to lump all SQL databases into a broad category and deal with them sensibly. A Code Camp grad who only knows SQLite may have trouble seeing that Oracle is superficially different but fundamentally similar.
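The "broad category" point can be made concrete. Python's DB-API is one example of the abstraction: the same connection/cursor/execute pattern works against SQLite, PostgreSQL, Oracle, and others; mostly the `connect()` call and minor SQL-dialect details differ. (The table and column names below are purely illustrative.)

```python
import sqlite3

# DB-API 2.0: the same cursor/execute/fetchall pattern applies to
# sqlite3, psycopg2 (PostgreSQL), and other drivers; only connect()
# and minor SQL-dialect details change between products.
def fetch_totals(conn):
    cur = conn.cursor()
    cur.execute("SELECT name, total FROM orders ORDER BY total DESC")
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (name TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("b", 30.0), ("c", 20.0)])
print(fetch_totals(conn))  # [('b', 30.0), ('c', 20.0), ('a', 10.0)]
```

A programmer who sees the interface, rather than one vendor's product, can move between databases without starting over.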

"cs is never used in the mainstream of business." True for some businesses. This is completely true for those businesses where "legacy" means "not very interesting." 

There is a great deal of not very interesting legacy code that fails to leverage a data structure more advanced than the flat file. This code is a liability, not an asset. The managers that let this happen probably didn't have a strong CS background and hired Code Camp graduates (because they're inexpensive) and created a huge pile of very bad code.

I've met these people and worked at these companies. It's a bad thing. The "leadership" that created such a huge pile of wasteful code needs to be fired. "All that bad code evolved during the 70's and 80's" isn't a very good excuse. A large amount of not interesting code can be replaced with a small amount of interesting code quickly and with almost zero risk.

Any company that's unable to pursue new lines of business because -- you know -- we've always done X and it's expensive to pivot to Y is deranged. They're merely holding onto their niche because they're paralyzed by fear of innovation=failure.

"Much of computer work is repetitive".  False. It's made repetitive by unimaginative management types who like to manage repetitive work. If you've done it twice, you need to be prepared to distinguish coincidence from pattern. When you've done it three times, that's a pattern, and you need to automate it. If you do it a fourth time, you're missing the opportunity to automate, wasting money instead of investing it.

"Much of computer work is ... uninviting to upwardly mobile people" Only in places where repetitive is permitted to exist.  If repetitive is not permitted, upward mobility will be the norm for the innovators.

"people who generally are moving up not improving the breed". I get this. The smart people move on. All we have left in this company are Code Camp graduates and their managers who value repetitive work and large volumes of not interesting code. 

Improving the Breed means what? 

Hiring CS graduates instead of Code Camp kiddies.




Thursday, July 10, 2014

The Permissions Issue

Why?

Why are Enterprise Computers so hard to use? What is it about computers that terrifies corporate IT?

They're paying lots of money to have me sit around and wait for mysterious approver folks to decide if I can be given permission to install development tools. (Of course, the real work is done by off-shore subcontractors who are (a) overworked and (b) simply reviewing a decision matrix.)

And they ask, "Are you getting everything you need?"

The answer is universally "No, I'm not getting what I need." Universally. But I can't say that.

You want me to develop software. And you simultaneously erect massive, institutional roadblocks to prevent me from developing software.

I have yet to work somewhere without roadblocks that effectively prevent development.

And I know that some vague "security considerations" trump any productive approach to doing software development. I know that there's really no point in trying to explain that I'm not making progress because I can't actually do anything. And you're stopping me from doing anything.

My first two weeks at every client:

The client tried to "expedite" my arrival by requesting the PC two weeks early, so it would be available on day 1. It wasn't. A temporary PC is -- of course -- useless. But that's the balance of days 1-5: piddling around with the temporary PC.

Day 6 begins with the real PC. It's actually too small for serious development: an oversight, since I was brought on as a developer but nobody ordered a developer's PC. I'll deal. Things will be slow. That's okay. Some day, you'll discover that I'm wasting time waiting for each build and unit test suite. Right now, I'm doing nothing, so I have no basis to complain.

Day 7 reveals that I need to fill in a form to have the PC you assigned me "unlocked." Without this, I cannot install any development tools.

In order to fill in the form, I need to run an in-house app. Which is known by several names, none of which appear on the intranet site. Day 8 is lost to searching, making some confused phone calls, and waiting for someone to get back to me with something.

Oh. And the email you sent on Day 9 had a broken link. That's not the in-house app anymore. It may have been in the past. But it's not.

Day 10 is looking good. The development request has been rejected because I -- as an outsider -- can't make the request to unlock a PC directly. It has to be made by someone who's away visiting customers or off-shore developers or something.

Remember. This is the two weeks I'm on site. The whole order started 10 business days earlier with the request for the wrong PC without appropriate developer permissions.

Thursday, July 12, 2012

Innovation, Arduino and "Tinkering"

Many of my customers (mostly super-large IT shops) wouldn't recognize innovative behavior.  Large organizations tend to punish defectors (folks that don't conform), and innovation is non-conforming.

I've just had two object lessons in innovation.  The state of the art has left many in-house IT development processes in the dust.  The cost and complexity of innovation has fallen, but organizations continue to lumber along pretending that innovation is risky or complex.

You can find endless advice on how to foster a culture of innovation.  Often, this advice includes a suggestion that innovative projects should somehow "fail fast".  I'm deeply suspicious of "fail fast" advice.  I think it misleads IT management into thinking there's a super-cheap way to innovate.  It's misleading because "fail fast" leaves too many questions unanswered.

  • How soon do you know something's about to be a failure?  
  • What's the deadline that applies so that failure can happen quickly?  
  • What's the leading indicator of failure?

If you are gifted enough to predict the future -- and can predict failure -- why not apply that gift to predicting success?  Give up on the silliness of unstructured "innovation" and simply implement what will not fail.

At MADExpo, I saw an eye-opening presentation on the Arduino.  I followed that with viewing Massimo Banzi's TED Talk on the subject of Arduino, imagination and open source.

There are two central parts of the Arduino philosophy.

  • Tinkering.
  • Interaction Design.

Background

Without delving too deeply, I'm trying to build a device that will measure the position of a hydraulic piston.  It's the hydraulic steering on a boat, and a measurement of the piston position provides the rudder position, something that's handy for adjusting sail trim to reduce the strain on an autopilot.

Clearly, such a device needs to be calibrated with the extreme port and starboard range of motion.  Barring unusual circumstances, the amidships position is simply the center between the two limits.

Part 1.  Buy an Arduino, a Sharp GP2Y0A02YK0F IR distance measurer (for 10-80 cm), plus miscellaneous things like breadboard, jumpers, LED's, test leads, etc.  A trip to Radio Shack covers most of the bases.  The rest comes from SparkFun, RobotShop, Digi-Key and Mouser.

Part 2.  Learn the language (a subset of C.)  Learn core algorithms (de-bouncing buttons and the IR sensor).
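The de-bouncing algorithm mentioned above is worth a sketch. A mechanical button doesn't produce a clean transition; it "bounces" between states for a few milliseconds. One standard fix is to accept a new state only after it has been stable for several consecutive samples. The Arduino version would be a few lines of C; here is the same idea in Python, with illustrative names, to show the logic.

```python
class Debouncer:
    """Accept a new input state only after it has been stable for
    stable_count consecutive samples -- a standard way to filter
    switch bounce (and jittery sensor readings)."""
    def __init__(self, initial=0, stable_count=3):
        self.state = initial          # last accepted (debounced) state
        self._candidate = initial     # state we're currently watching
        self._run = 0                 # consecutive samples of candidate
        self.stable_count = stable_count

    def sample(self, raw):
        if raw == self._candidate:
            self._run += 1
        else:
            self._candidate = raw
            self._run = 1
        if self._run >= self.stable_count:
            self.state = self._candidate
        return self.state

d = Debouncer()
noisy = [0, 1, 0, 1, 1, 1, 1, 0, 1, 1]   # bouncy press of a button
print([d.sample(s) for s in noisy])
# [0, 0, 0, 0, 0, 1, 1, 1, 1, 1] -- one clean transition
```

The isolated 0 late in the stream is rejected the same way the early bounces are: it never stays stable long enough to be accepted.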

Tinkering

At this point, we've tinkered.  Heavily.

What's important for IT managers is that tinkering doesn't have a project plan.  It doesn't have a simple schedule and clear milestones.  It's a learning process.  It's knowledge acquisition.

The current replacement for tinkering is training.  Rather than learn by attempting (and failing), IT managers hire experts to pass on knowledge.  This is, generally, limiting and specifically stifles innovation.

Years ago, I worked on embedded systems: hardware/software hybrids.  We burned ROMs and programmed in assembler.  Back in those days, this kind of tinkering was difficult, and consequently frowned upon.  It was difficult to specify, locate, source, and assemble the components.  There was a lot of reading complex product data sheets to try and determine what to buy and how few were needed.

What had once been a very serious (and very difficult) electrical engineering exercise (IR sensor, button, LED, power supply, etc., etc.) was a few days of tinkering with commodity parts.  The price was low enough and availability ubiquitous enough that frying a few LED's is of no real consequence.  Even frying an Arduino or two isn't much of a concern.

Interaction Design

The next step is to work out the user interface.  For the normal operating mode, the input comes from the hydraulic piston and the output is some LED's to show the displacement left or right of center.  Pretty simple.

However.

There's the issue of calibration.  Clearly, the left and right limits (as well as center position) need to be calibrated into the device.

Just as clearly, this means that the device needs buttons and LED's to switch from normal mode to calibration mode.  And it needs some careful interaction design.  There are several operating modes (uncalibrated, calibrating, normal) with several submodes for calibrating (setting left, setting right, setting center.)

Once upon a time, we wrote long, wordy documents.  We drew complex UML state charts.  We drew all kinds of pictures to try and capture the important features of the interaction.

Enter Arduino

The point of Arduino is not to spend too much time up front over-specifying something that's probably a bad idea.   The point is to experiment quickly with different user interface and interaction experiences to see what works and what doesn't work.

The same is true of many modern development environments.  Web development, for example, can be done by using sophisticated frameworks, writing little backend code and messing with the jQuery, CSS and HTML5 aspects of the interaction.

The scales fell from my eyes when I started to document the various operating modes.

Arduino doesn't have a great unit testing environment.  It's for tinkering, after all.  It's also for building small, focused things.  Not large, rambling, hyper-complex things.  You can achieve complexity through the interaction of small, easy-to-test things.  But don't start out with complexity.

After writing a few paragraphs, I realized that the piston movements could easily be self-calibrating.  Simply track the maximum and minimum distances ever seen.  That's it.  Nothing more.  In the case of a boat, it means swinging the wheel from stop to stop to define the operating range.  That's it.

A button (to clear the accumulated history) is still useful.  But much simpler since it's a one-time-only reset button.  Nothing more.
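The self-calibration idea is simple enough to sketch. Track the minimum and maximum raw readings ever seen; amidships is the midpoint of the two limits, and a reset clears the accumulated history. This is a Python illustration of the logic, not the original Arduino code; the names and the normalized -1.0 to +1.0 output are my own choices.

```python
class RudderSensor:
    """Self-calibrating position sensor: remember the lowest and
    highest raw readings ever seen.  Swinging the wheel stop to stop
    establishes the operating range; center is simply the midpoint."""
    def __init__(self):
        self.reset()

    def reset(self):
        """The one-time reset button: clear the accumulated history."""
        self.low = None
        self.high = None

    def read(self, raw):
        # Update the limits with every reading -- that's the whole
        # calibration algorithm.
        if self.low is None or raw < self.low:
            self.low = raw
        if self.high is None or raw > self.high:
            self.high = raw
        if self.low == self.high:
            return 0.0  # not yet calibrated; report amidships
        center = (self.low + self.high) / 2
        half_range = (self.high - self.low) / 2
        # -1.0 is hard over one way, +1.0 the other, 0.0 amidships.
        return (raw - center) / half_range

s = RudderSensor()
for r in (12, 80, 46):   # stop-to-stop sweep, then amidships
    pos = s.read(r)
print(pos)               # 0.0 -- 46 is midway between 12 and 80
```

No calibration mode, no submodes, no stored limits to set: the interaction design collapses to "swing the wheel once" plus one reset button.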

Moving from idea to working prototype took less time than writing this blog post.

Next steps are to tinker with various display alternatives.  How many LED's?  What colors?  LCD Text Display?  There are numerous choices.

Rather than wasting time on UML, specifications, whiteboards and diagrams, it's a simpler matter to write the user stories and tinker with display hardware.

Thursday, May 17, 2012

Flickr, Innovation and Integration

Read this on Gizmodo: "How Yahoo Killed Flickr and Lost the Internet"

Compelling stuff: "Integration Is The Enemy of Innovation".

"[Corporate Development milestones] often completely ignore what made the smaller target valuable in the first place."

Lessons learned: it's hard to apply structured, formal, financial controls to innovation.  As soon as the accountants show up, innovation will be stopped.  Someone has to champion the freedom to innovate in spite of being part of a profit-seeking corporation.

Tuesday, May 8, 2012

More of Disruptive Technology Change

There's a cool infographic on technology change in FrugalDad.  See The Great Disruption: The Future of Personal Tech.  It's interesting and informative, but the few predictions it makes are not really disruptive.  You wouldn't see anyone lobbying against the suggested future directions.  They're all good ideas that leverage existing technology.

On the other hand, there's a great graphic that shows how disruptive technology is labeled as illegal.  See Infographic: Why the movie industry is so wrong about SOPA.

Consider just one example.  Digital Movies.  The DVD was so frightening to movie producers (or distributors or theaters or the whole supply chain) that discussion of circumvention of DVD encoding had to be made illegal.  That kind of industry legislative action is evidence that a technology is truly disruptive.

Disruptive change will often lead to fearful rejection and legislative action.

"But wait," you say, "no one tried to make the iPod illegal."  Correct.   The iPod is not the core disruptive change.  Digital music is the disruptive change.  The iPod is just a vehicle.  Apple is making their money by providing a platform for digital content.

If you want to know what the Next Big Thing is, look to the US Congress.  Lobbyists are trying to make some things illegal merely because they're disruptive.

Universal Health Care, as one example, is being fought against.  There are lots of specious and farcical reasons being used to argue against simplifying the insurance mess that has emerged over the last few decades.  If Congress is fighting against it, that means the following:

1.  It's disruptive.  Game Changing.  Terrifying.
2.  The old school companies are spending huge lobbying and campaign budgets to prevent change.  They are unable to adapt to a different set of rules.
3.  Some new school companies stand to be wildly profitable if the change ever gets past the Congressional objections.

For another example, read this brilliant article: How Ma Bell Shelved the Future for 60 Years.  This is an example of internal censorship of disruptive technology.  "More precisely, in Bell's imagination, the very knowledge that it was possible to record a conversation would "greatly restrict the use of the telephone," with catastrophic consequences for its business. Businessmen, for instance, the theory supposed, might fear the potential use of a recorded conversation to undo a written contract."

You know it's disruptive when it's actively feared.

Tuesday, March 27, 2012

Patents vs. Innovation

Read "Why Software Patents are Evil" by Simon Phipps in InfoWorld.
It's an excellent summary of the problems caused by patents applied to software.

There's a great TED Talk by Johanna Blakley on "Lessons Learned from Fashion's Free Culture" which reinforces the essential point.

Software patents don't help anyone.  The open source movement is evidence that folks working outside the constraints of patent lawyers are more innovative and produce high-quality software.  The Internet is built on non-proprietary technology (TCP/IP and related protocols), GNU/Linux, Apache and similar software componentry.  How have patents helped?

Thursday, March 22, 2012

Detailed Analysis of Disruptive Technology Change

Read this: Why I doubted Facebook could build a billion dollar business, and what I learned from being horribly wrong.

Don't be afraid to read it again.
"when it comes to the exceptional cases, all bets are off. So keep your mind open to weird, young [ideas] that you meet that don't fit the established pattern"
Sound advice.  The best ideas are disruptive.  That means that the idea does not fit an established pattern.

The problem with being an architect is that software architecture is a political game.

In order to justify large projects with large funding, you must cater to the folks with money who (generally) feel that disruption == risk.  The idea of incremental effort and proofs of concept may not fly because they've decided that inappropriate incumbent technology is magically quicker than appropriate but novel technology.

There's a profound Software Process Improvement issue here.  Organizations can (and do) stifle innovation in an effort to "improve" their software development process.  The false hope is that an unchanging technology base is somehow helpful at making people more effective.

Even if you give people second-rate tools, you can eventually get to be pretty good at using them.  However.  Using better tools might be better than trying to get really good at using poor tools.

What I find endlessly funny are folks who want "formal research" or "studies" that prove that some new idea is actually better than existing ideas.  You can read Stack Overflow and programmers.stackexchange.com questions looking for studies that prove the value of unit testing or prove the value of a NoSQL database or prove that software is simpler without triggers or stored procedures.

For the moment, these are disruptive ideas.

We know they're disruptive because people keep asking for proof.

When they stop asking for proof, you know the idea has finally "arrived" and it's time to move on to find the edge of the envelope again.

Tuesday, March 20, 2012

Innovation is Disruptive -- and sometimes forbidden

Saw this on Twitter from @hunterwalk:
"Startups piss people off because their existence is a statement that incumbents aren't doing their job well enough"
Also true of IT internal innovation.  Pitch a novel, innovative idea to management, and most organizations will find ways to avoid it.  Suggesting a bold new direction makes it look like someone isn't doing their job.

If you want to see real push-back, try suggesting that the incumbent technology platform needs to be replaced.

As an example, consider an all-singing-all-dancing all VB shop.  The idea that C# might be better is met with a variety of responses.

  • It's too costly to change now.  We can't afford the training or the licenses or something.  The list is long and often includes silly costs based on a really bad adoption strategy.  A bad adoption plan allows someone to defend their incumbent technology.
  • It's too risky to change now.  What risks?  The list of risks is often surprising and frustrating.  My favorite is the blanket "We don't know what we don't know" risk statement.  That's designed to be a complete show-stopper because there's no evidence to counter it.
  • The new Visual Studio has features that make VB acceptable for development.  It's so important to keep the legacy technology that excuses can be made and work-arounds applied to preserve it.
As another example, consider replacing a 30-year old COBOL system.  As part of stalling an innovative plan, I've been told that the only scalable transaction-processing technology is COBOL-CICS-VSAM.  This was about five years ago, when the incumbency of COBOL might have seemed doubtful.  But to IT staff, the idea of Java was too innovative.  

The other problem was the innovative idea of a phased implementation.  Yes.  Agile thinking can be seen as disruptive to project managers; it can appear that they don't add much value.  The idea that we'd build "bridges" between legacy applications and new applications was so unpleasant that we had to spend a long time discussing the maintenance and support of throw-away code that existed just long enough to be sure that all the relevant COBOL had been rewritten.  

Bridges between old and new were portrayed as costly and risky.  These are the usual responses to a proposed new way of looking at the problem.  And, of course, a phased implementation was inherently low-value.  I've been told that a project was absolutely "all or nothing" and no piece had value separate from the complete scope.

Suggesting a change means that there's a problem, right?  It means their 30-year track record of COBOL support is less than perfect.  It means their ability to use VB is flawed in some way.   The only reason for a change is because -- somehow -- they have failed.  


Thursday, June 9, 2011

An Object-Lesson in How to Stifle Innovation

Read this: How Ma Bell Shelved the Future for 60 Years.
"AT&T firmly believed that the answering machine, and its magnetic tapes, would lead the public to abandon the telephone."
How many good ideas are set aside by managers who simply don't have a clue what users actually want?

How many great IT projects are rejected because of this kind of delusional paranoia?

Thursday, March 10, 2011

To Robert Fulton, Regarding the "Steam Boat"

"What sir, would you make a ship sail against the wind and currents by lighting a bonfire under her deck? I pray you excuse me. I have no time to listen to such nonsense."

-- Napoleon Bonaparte
There's no authoritative source for this quote. Since Fulton was commissioned to build a submarine and did build a steam-powered boat in France, it's unlikely that the quote is genuine.

A great list of related quotes: Famous Authoritative Pronouncements.

Thursday, September 16, 2010

What Innovation Looks Like

Check out "End User 2.0: When Employees Have All The Answers" in InformationWeek. This is about adoption of non-approved technology. Think iPad.

This shows what innovation looks like when it happens.

1. There's no process for innovation.

2. There's no "permission to fail". Folks just fail or succeed without anyone's support or permission.

3. It's disruptive. Many IT departments don't know how to cope with USB drives, iPads and related leading-edge technology. So these things are simply banned. (Ever walked past a sign that says "No Recording Devices Allowed Beyond This Point" with your iPhone?)

Here's one great quote: "Policies around regulatory compliance, reliability, budget approvals, and support all give IT teams reasons to resist technology driven by end users."

Technology innovation is happening. It is disruptive. Therefore, IT tends to resist the disruption.

The best stall tactic: "Security". If IT lifts up security as an issue, they can resist technology innovation effectively.

Other Disruptive Change

This happens everywhere. It isn't just the iPad. All disruptive, innovative change is met with serious resistance.

Agile Methods. Some IT departments resist agile methods because -- obviously -- the lack of a comprehensive and detailed project plan is a problem. Failure to plan is a plan for failure. The idea of building intentional flexibility into predicting the future is rejected. It's too disruptive to the IT chain of command to reduce the need for project managers.

Dynamic or Functional Programming Languages. It was painful to adopt Java (or C#). Adopting another, different language like Python is insanity. Obviously. Anyone in "Big IT" who is a serious Java or C# developer can tell you that a dynamic language is obviously unsuitable for production use. Mostly, the reasons boil down to "it's different"; different is too disruptive.

NoSQL Data Management. Clearly, the relational database is the only form of persistence that can possibly be used. It is perfect in every way. It can be used as a message queue (because adopting an actual message queue is too much work). It can be used for temporary or transient data. It can be used for non-relational objects like XML documents. Any suggestion that we use something other than a database is often met with derision. Clearly, a non-SQL database is disruptive to the orderly flow of data.

Simplified Architecture. [This is code for "No Stored Procedures".] Since stored procedures have been used with mixed success, some folks argue that they should be used more. On the other hand, it seems peculiar to me to intentionally fork application logic into two places -- application code and database. It seems to add complexity with no value. Lots of DBA's try to explain that some logic is "lower-level" or "more closely associated with the data" or "is a more 'essential' business rule." There's no dividing line here that makes stored procedures necessary or useful.

Try to prevent the problems associated with stored procedures and you will receive a $#!+-storm of abuse. Every time. Reducing the use of stored procedures is a disruptive change. An innovation. A bad thing.

[Want proof of the non-essential nature of stored procedures? Watch what happens when you upgrade or replace an application and migrate your data. Did you need the stored procedures? No, you left those behind. You only kept the data.]

Friday, April 9, 2010

iPad Thoughts -- Fashion Accessory?

From a Blog that's inside a company's firewall, so this had to be heavily edited.
"The instant ON is a relief. The full page touch screen works just like on the iPhone - only better. Web pages look great. Photographs and Movies are fabulous. The screen resolution is fantastic. Sharing pictures makes it clear that the photo album is history. Tough times for Kindle. Email - much better than on the Blackberry. The things we like on the PDAs are all more attractive - and more usable! Almost like on a laptop."
Also.
"I did not have an easy way to view Excel & Powerpoint. 3G is not available for another month. ... No Adobe Flash. For some, the one big 'flaw' will be the lack of a 'file system'."
Finally, emphasis mine.
"The iPad is not a big leap, it is just a step, a big iTouch. But this is the last step that brings a whole new vision home. While not quite ready for Enterprise deployment, it gives us time to get going. And this may be the Tablet that makes it acceptable for men to carry handbags"
Okay. Time to start shopping for a nice Timbuk2 messenger bag.

Sunday, September 20, 2009

Innovation and Outsourcing

Good stuff in ComputerWorld: Partnerships can Go Too Far.

"Consider vendor innovation. As companies become large and entrenched, they typically become more risk-averse and less creative, often rejecting ideas that challenge conventional wisdom."

This is really only half the story.

First Things First

Programming is hard -- really hard. Read EWD 316, chapter 2. By extension, most of IT is saddled with really, really complex and difficult problems.

"As a result of its extreme power, both the amount of information playing a role in the computations as well as the number of operations performed in the course of a computation, escape our unaided imagination by several orders of magnitude. Due to the limited size of our skull we are absolutely unable to visualize to any appreciable degree of detail what we are going to set in motion, and programming thereby becomes an activity facing us with conceptual problems that have risen far, far above the original level of triviality."

Given that IT is hard, it therefore entails either some risk of failure or considerable cost to avoid failure. It also involves an -- often unknown -- amount of learning.

Before writing software, we really do need to learn the language, tools, architecture and components we're going to use. Not a 1-week introduction, but a real project with real quality reviews. Sometimes two projects are required to ferret out mistakes.

Also, before writing software, we really do need to understand the problem. Sadly, many business problems are workarounds to bad software, leaving us with many alternative solutions that are all equally bad and don't address the root-cause problem.

No-Value Features

Programmers will often pursue no-value features that are part of the language, tools, components or architecture. This drives up cost and risk for no value.

Business Short-Sightedness

The compounding problem is a short-sighted business impetus toward delivering something that mostly works as quickly as possible. Often, business folks buy into the no-value features, and overlook the real problem that we're supposed to be solving.

Sigh.

The Result: Stifling

The result of (a) inherent complexity, (b) no-value features and (c) short-sighted buyers is that IT management finds ways to stifle all IT innovation.

In effect, most companies outsource innovation. They hope that their vendors will provide something new, different and helpful. The IT organization isn't allowed to invest in the learning or take the risks necessary to innovate.

The ComputerWorld article points out that some companies then put Preferred Supplier Plans in place which further stifle innovation by narrowing the field of vendors to only the largest and least innovative.