
Tuesday, November 29, 2011

The Value of Microsoft's Tools

See Andrew Binstock's "Windows 8: Microsoft's Development Re-Do".
"The costs of these migrations has been enormous and continues to accumulate..."
I can only rub my hands with glee and engage in shameless "I Told You So" self-congratulations.

Only you can prevent being held hostage by Microsoft.

More than once, I've observed that a strategy of using only proprietary tools would be expensive and complex.  And every time, the folks I was talking to trivialized my concerns as hardly worth considering.

I've seen orphaned software: it only compiles on an old version of Visual Studio.   I've seen software orphaned so badly that it can only be compiled on one creaky old PC.  The cost to convert was so astronomical that the customer preferred to hope for a product to arise somewhere in the marketplace.  When no suitable product appeared over the decades, the problem reached palpable Pants On Fire (POF) levels of panic.  All due to the hidden costs of Microsoft's tools.

I've even been told that VB is a terrible language, but Visual Studio makes it acceptable.

Thursday, November 24, 2011

Justification of Project Staffing

I really dislike being asked to plan a project.  It's hard to predict the future accurately.

In spite of the future being -- well -- the future, and utterly unknowable, we still have to have the following kinds of discussions.

Me: "It's probably going to take a team of six."

Customer: "We don't really have the budget for that.  You're going to have to provide a lot of justification for a team that big."

What's wrong with this picture?  Let's enumerate.
  1. Customer is paying me for my opinion based on my experience.  If they want to provide me with the answers, I have a way to save them a lot of money: write their own project plan with their own answers and leave me out of it.
  2. I've already provided all the justification there is.  I'm predicting the future here.  Software projects are not simple Rate-Time-Distance fourth-grade math problems.  They involve an unknown number of unknowns.  I can't provide a "lot" of justification because there isn't any indisputable basis for the prediction.
  3. I don't know the people. The customer -- typically -- hasn't hired them yet.  Since I don't know them, I don't know how "productive" they'll be.  They could hire a dozen n00bz who can't find their asses blindfolded even using both hands.  Or.  They could hire two singular geniuses who can knock the thing out in a weekend.  Or.  They could hire a half-dozen arrogant SOB's who refuse to follow my recommendations. 
  4. They're going to do whatever they want no matter what I say.  Seriously.  I could say "six".  They could argue that I should rewrite the plan to say "four" without changing the effort and duration.  Why ask me to change the plan?  A customer can only do what they know to be the right thing. 
Doing the Right Thing

Let's return to that last point.  A customer project manager can only do what they absolutely know is the right thing.  I can suggest all kinds of things.  If they're too new, too different, too disturbing, they're going to get ignored.

Indeed, since people have such a huge Confirmation Bias, it's very, very hard to introduce anything new.  A customer doesn't bring in consultants without having already sold the idea that a software development project is in the offing.  They justify spending a few thousand on consulting by establishing some overall, ball-park, big-picture budget and showing that the consulting fees are just a small fraction of the overall.

As consultants, we have to guess this overall, ball-park, big-picture budget accurately, or the project will be shut down.  If we guess too high, then the budget is out of control, or the scope isn't well-enough defined, or some other smell will stop all progress.  If we guess too low, then we have to lard on additional work to get back to the original concept.

Architectures, components and techniques all have to meet expectations. A customer that isn't familiar with test-driven development, for example, will have an endless supply of objections.  "It's unproven."  "We don't have the budget for all that testing."  "We're more comfortable with our existing process."

The final trump card is the passive-aggressive "I'll have to see the detailed justification."  It means "Don't you dare."  But it sounds just like passive acceptance.

Since project managers can only do what they know is right, they'll find lots of ways of subverting the new and unfamiliar.

If they don't like the architecture, the first glitch or delay or problem will immediately lead to a change in direction to yank out the new and replace it with the familiar.

If they don't like a component, they'll find numerous great reasons to rework that part of the project to remove the offending component.

If they don't like a technique (e.g., Code Walk Throughs), they'll subvert it.  Either not schedule them.  Or cancel them because there are "more important things to do."  Or interrupt them to pull people out of them.

Overcoming the Confirmation Bias

I find the process of overcoming the confirmation bias to be tedious.  Some people like the one-on-one "influencing" role.  It takes patience and time to overcome the confirmation bias so that the customer is open to new ideas.  I just don't have the patience.  It's too much work to listen patiently to all the objections and slowly work through all the alternatives.

I've worked with folks who really relish this kind of thing.  Endless one-on-one meetings.  Lots of pre-meetings and post-meetings and reviews of drafts.  I suppose it's rewarding.  Sigh.

Tuesday, November 22, 2011

How to Learn

A recent question.
i came up with two options.
 1.  building skills 1 (+ other references)... then algorithms & data structures.... then your books 2 & 3

or

 2.  your three books 1,2 & 3... then algo & ds

kindly help me decide so i can start soon. 

I have two pieces of advice.

First.  Programming is a language skill.  Just like English.  If you can't get the English right, the odds of getting Python, Java, HTML or SQL right are considerably reduced.  Please, please, please take more care with grammar, syntax and punctuation.  Otherwise, your future as a programmer doesn't look very good.  For example, the personal pronoun is spelled "I".  In the 21st century, we spell out "and"; we stopped writing "&" as a stand-in for the Latin "et" centuries ago.  Also, ellipses ("...") shouldn't be used except when eliding part of a quote.  Clarity and precision actually matter.

Second, and more relevant, your two choices don't really amount to a significant difference.  If you're waiting around for advice, you're wasting your time.  Both sequences are good ideas. It's more important to get started than it is to carefully choose the precise and exact course of study. Just start doing something immediately.

Learning to program is a life-long exercise. There will always be more to learn. Start as soon as you can. The exact choices don't matter.  Why?  Because, eventually, you'll read all of those books plus  many, many others.

Spend less time waiting for advice and more time studying.

Thursday, November 17, 2011

More On Inheritance vs. Delegation

Emphasis on the "More On" as in "Moron".  This is a standard design error story.  The issue is that inheritance happens along an "axis" or "dimension" where the subclasses are at different points along that axis.  Multi-dimensional inheritance is an EPIC FAIL.

Context

Data warehouse processing can involve a fair amount of "big batch" programs.  Loading 40,000 rows of econometric data in a single swoop, updating dimensions and loading facts, for example. 

When you get data from customers and vendors, you have endless file-format problems.  To ensure that things will work, each of these big batch programs has at least two operating modes.
  • Validate.  Go through all the motions.  Except.  Don't commit any changes to the database; don't make any filesystem changes.  (i.e., write the new files, but don't do the final renames to make the files current.)
  • Load.  Go through all the motions including a complete commit to the database and any filesystem changes.
Problem

What's the difference between the two modes?  Clearly, one is a subclass of the other.
  • Load can be the superclass.  The Validate subclass simply replaces the save methods with stubs that do nothing.
  • Validate can be the superclass.  The Load subclass simply implements the save-method stubs with methods that do something useful, as sketched below.
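
A minimal sketch of the second option, with hypothetical table and column names (company_dim, econ_fact) standing in for the real warehouse schema, and a DB-API connection assumed for persistence:

```python
class Validate:
    """Go through all the motions; persist nothing."""

    def process(self, rows):
        # Clean the dimension values and build the fact rows.
        company_dim = sorted({row["company"].strip().upper() for row in rows})
        facts = [(row["company"], row["period"], row["value"]) for row in rows]
        self.save_dimension(company_dim)
        self.save_facts(facts)

    def save_dimension(self, dimension):
        pass  # stub: no database commit in Validate mode

    def save_facts(self, facts):
        pass  # stub: no database commit in Validate mode


class Load(Validate):
    """Same motions as Validate, plus real commits."""

    def __init__(self, connection):
        self.connection = connection  # assumed to be a DB-API connection

    def save_dimension(self, dimension):
        self.connection.executemany(
            "INSERT INTO company_dim (name) VALUES (?)",
            [(name,) for name in dimension],
        )
        self.connection.commit()

    def save_facts(self, facts):
        self.connection.executemany(
            "INSERT INTO econ_fact (company, period, value) VALUES (?, ?, ?)",
            facts,
        )
        self.connection.commit()
```
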
Simple, right?

Wrong.

What Doesn't Work

This design has a smell.  The smell is that we can't easily extend the overall processing to include an additional feature. 

Why not? 

This design has the persistence feature set as the inheritance axis or dimension.  This is kind of limited.  We really want a different feature set for inheritance.

Consider a Validate for two dimensions (Company and Time) that loads econometric facts.  It has stub "save" methods.

We subclass the Validate to create the proper Load for these two dimensions and one fact.  We replace the stub save methods with proper database commits. 

After the actuaries think for a while, suddenly we have a file which includes an additional dimension (e.g., business location) or an additional fact (e.g., econometric data at a different level of granularity).  What now?  If we subclass Validate to add the dimension or fact, we have a problem.  We have to repeat the Load subclass methods for the new, extended Load.  Oops.

If we subclass Load to add the dimension or fact, we have a problem.  We have to repeat the Validate stubs in the new extended Load to make it into a Validate.  Oops.
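
Continuing the sketch above (same hypothetical names): when the file grows a location dimension, the extended Validate is easy, but turning it back into a Load means repeating Load's save methods verbatim.

```python
class ValidateWithLocation(Validate):
    """Validate, extended with the new location dimension."""

    def process(self, rows):
        super().process(rows)
        location_dim = sorted({row["location"] for row in rows})
        self.save_location(location_dim)

    def save_location(self, dimension):
        pass  # stub, as before


class LoadWithLocation(ValidateWithLocation):
    """The extended Load: Load's save methods have to be repeated."""

    def __init__(self, connection):
        self.connection = connection

    def save_dimension(self, dimension):
        # Copy-and-paste of Load.save_dimension.  Oops.
        self.connection.executemany(
            "INSERT INTO company_dim (name) VALUES (?)",
            [(name,) for name in dimension],
        )
        self.connection.commit()

    def save_facts(self, facts):
        # Copy-and-paste of Load.save_facts.  Oops.
        self.connection.executemany(
            "INSERT INTO econ_fact (company, period, value) VALUES (?, ?, ?)",
            facts,
        )
        self.connection.commit()

    def save_location(self, dimension):
        self.connection.executemany(
            "INSERT INTO location_dim (name) VALUES (?)",
            [(name,) for name in dimension],
        )
        self.connection.commit()
```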

Recognizing Delegation

It's difficult to predict inheritance vs. delegation design problems.

The hand-waving advice is to consider the essential features of the object.  This isn't too helpful.  Often, we're so focused on the database design that persistence seems essential.

Experience shows, however, that some things are not essential.  Persistence, for example, is one of those things that should always be delegated.

Another thing that should always be delegated is the more general problem of representation: JSON, XML, etc., should rely on delegation since this is never essential.  There's always another representation for data.  Representation is always independent of the object's essential internal state changes.
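
Here's a minimal sketch of the delegated version, again with the hypothetical names from above: one processing class, with persistence handed off to a separate saver object.  Validate versus Load becomes a choice of saver, and a new dimension is one new save call rather than a new branch of the class tree.

```python
class StubSaver:
    """Validate mode: go through the motions, commit nothing."""

    def save(self, table, columns, rows):
        pass


class DatabaseSaver:
    """Load mode: real INSERTs and a commit."""

    def __init__(self, connection):
        self.connection = connection  # assumed DB-API connection

    def save(self, table, columns, rows):
        placeholders = ", ".join("?" for _ in columns)
        statement = (
            f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
        )
        self.connection.executemany(statement, rows)
        self.connection.commit()


class EconometricBatch:
    """One processing class; persistence is delegated to the saver."""

    def __init__(self, saver):
        self.saver = saver

    def process(self, rows):
        company_dim = sorted({row["company"].strip().upper() for row in rows})
        facts = [(row["company"], row["period"], row["value"]) for row in rows]
        self.saver.save("company_dim", ["name"], [(name,) for name in company_dim])
        self.saver.save("econ_fact", ["company", "period", "value"], facts)


# Validate run:  EconometricBatch(StubSaver()).process(rows)
# Load run:      EconometricBatch(DatabaseSaver(connection)).process(rows)
```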

Consequence

In my case, I've got about a dozen implementations built on this kind of clunky inheritance, with some copy-and-paste programming.  Oops.

I'm trying to reduce that technical debt by rewriting each one as a proper delegation.  With good unit test coverage, there's no real technical risk.  Just the tedium of fixing the same mistake that I rushed into production twelve separate times.

Really.  Colossally dumb.