Thursday, July 7, 2016

Curiosity over Judgment: getting past defensiveness.


Say someone wants some aspect of my behavior to change: dress, schedule, food, drink, speech, whatever.

I initially get a little upset, defensive. But that's not productive. That's just anger and ego protection.

Is there a more excellent way?

I'm told that the secret to success is to replace judgment ("that's none of your business", "you are trying to control me", "screw off, I do what I want", "that's not fair") with curiosity.

"replace judgment with curiosity"

Curiosity. Hmmm. Let's try that. I wonder where that will lead.

I'll have to suspend my anger and judgment and blame and quit assigning motives to others if I'm going to process this with curiosity.

So much is possible now, and there are so many questions:

So, why is it so important to you how I behave?
  • Is it for a purpose? 
  • Does it serve you in some way? 
  • Is it because you care about me and think it will help me? 
  • Is it so that we'll conform to the same standard? 
  • Is it because without the change I am missing opportunities? 
  • Is it for our mutual good? Only mine? Neither? 
And then -- since I got defensive -- there are other questions for me to answer first:
  • Why is this behavior so important to me? 
  • Why should I cling to this behavior?
  • Is this behavior symbolic to me? Of what? 
  • Is this behavior central to my purpose and place in the world?
  • Has my current behavior limited me or expanded my opportunities and enjoyment of life? 
  • Is it really "just something I do"?
  • Is the new behavior really detrimental to me? 
And then... what is best?

If I take the position that I should want what is best for me, whatever builds the future and the person and family that I value so much, then the question becomes:
"what do I really want?"


Now I can make a decision with clarity, intention, values, and purpose.
I'm free.
I have power over my life.
I can choose.

I don't even have to make a permanent decision for life. I can choose for now and revisit later, when I know more about how this change affects me, you, and us.

"what does this mean to you?"

But of course, to get past judgment and reach a resolution, both people are probably going to have to have this conversation -- in curiosity, and not in blame or judgment.

I wish the world worked this way more often.

Friday, July 1, 2016

What is "Agile"?

Most of agile is about delivering sooner and a lot more often, so that you can incorporate the client's desires and feedback into your daily work. 

The rest is about how to make that sustainable for development teams.

Monday, April 4, 2016

Naming is still hard.

Ideally, we would have excellent and obvious names for all variables, classes, packages, methods, etc. But it's hard to be excellent all the time. The people building the Java libraries struggled with names, I can tell, and though I don't always appreciate their choices, I recognize that this is a multifaceted problem.

Maybe a moment of agonizing over a bad name will help share a mental model, along with some patterns and smells for naming methods.

Let's give it a shot.

A Tiny Example

Here is a tiny, dull, dumb snippet:
 // imports needed: java.util.Date, java.util.Locale, java.text.DateFormat,
 // plus a static import of DateFormat.LONG so it can appear bare below
 Date now = new Date();
 DateFormat df = DateFormat.getDateInstance(LONG, Locale.FRENCH);
 System.out.println(df.format(now));

Java has a method called getDateInstance().

Does it return an instance of a date? No, it does not.

If it lived in a class called Date then it would be a terribly wrong name, but it doesn't live there.

It returns an instance of a date formatter, whose class is called DateFormat.

DateFormat doesn't seem like such a good name, because it is an object with a format method. A DateFormat with a format method seems odd and redundant. Is it one, or does it make one, or what? But let's not worry about that for a minute.

It lives in the DateFormat class, so technically its name is DateFormat.getDateInstance. That's better, but it's possibly both misleading (not a date instance) and redundant ('Date' appearing in both class and method name). Would DateFormat.getInstance() be a better name?

Well, that depends on how you import it. If you import the DateFormat class, then DateFormat.getInstance() seems harmless enough, but you might statically import getInstance itself, and then the use of the method appears without needed context:

      System.out.println(getInstance(LONG, Locale.US).format(now));

Instance of a LONG? Instance of a US? Should the name be getInstanceOfClassDateFormat? Ew. Or getInstanceOfFormatter?

Well, the call to format seems to help give context so we know more about it, and we can hover over the getInstance() call in an IDE to help us see where it comes from. It is harmless and survivable.

But it doesn't seem excellent.

Identifying a Noise Word


Trying to finesse or expand the word Instance is not productive here.

The problem with Instance is that it is a noise word.  It is like Data, Manager, Information, and so many other space-consuming bits of non-meaning we often assign to variables and classes.

So maybe the question that helps with naming is to ask again why this function exists.

It seems to me that it exists to provide the date formatter that the user requested from among the many date formats that may exist in different locales.

That suggests that the name should probably be something like getLocaleSpecificDateFormatter, but that's a real handful to type and most of the interesting words are near the end. Eww.

Perhaps getLocaleFormatter() is sufficient, since it's in the DateFormat class to begin with.

I prefer that, but I'm not crazy about it. I don't even like starting with 'get', which leads me to ...

Working with the Audience


I don't care for the getter/setter standard in Java. I would prefer to see something like DateFormat.for(LONG, Locale.FRENCH) -- though 'for' is a reserved word in Java, so it would have to be something near it. In any case, I have to balance that preference with the idioms and habits of the Java community.

A programmer working in an IDE will type the word get, and then ask the IDE for completion. That's a powerfully useful habit in Java IDEs, and it behooves us to comply.  Therefore 'get' is mandatory.

Now the next-most important thing is the word following get, because the programmer needs to quickly select the method they want from the completion list.

The next most important word seems to be Locale. Let's make that word #2.

Now we see DateFormat.getLocale -- and that's still misleading. We don't really want a Locale here, but rather an object that will format a date for us. Drats.

Do we drop "Locale" from the name elaborate further?

It seems that Locale is important, and we don't want to move it later in the name or drop it entirely, so we'll elaborate a little further to see where it takes us.

So if we're not getting a locale, but a formatter, let's append the word formatter.

We suck up the redundancy issue. DateFormat.getLocaleFormatter() seems like the best we can do without agonizing over this for weeks.

Looking at the example of usage (in a test, of course) I see something like this:
 DateFormatter df = getLocaleFormatter(LONG, Locale.FRENCH);
 String formattedDate = df.format(now);
This seems to reveal intent so much better than getInstance().

Of course, the Java libraries are in wide use, and people have already formed habits and programs that would break if we renamed the library method now. So we don't do anything about it, and go back to calling getDateInstance() while gritting our teeth.
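Although we can't rename the library method, nothing stops us from putting the better name on a tiny wrapper in our own code. A minimal sketch, assuming a hypothetical helper class of our own called DateFormats:

 import java.text.DateFormat;
 import java.util.Locale;

 // Hypothetical wrapper in our own codebase; the JDK API is untouched.
 public final class DateFormats {
     private DateFormats() {}

     public static DateFormat getLocaleFormatter(int style, Locale locale) {
         return DateFormat.getDateInstance(style, locale); // delegate to the real method
     }
 }

Callers then read DateFormats.getLocaleFormatter(DateFormat.LONG, Locale.FRENCH), and the teeth-gritting is confined to a single line.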

Oh, look. The variable df has a horribly bad name. I bet I could fix that...


What do we learn from this exercise?


In this (perhaps silly) example, we examined a name that we cannot change, but along the way you and I have considered:

  • the direct honesty of a name (and found it wanting)
  • the context of the name (the package and usage)
  • the existence of noise words in the name
  • the way in which other programmers use names
  • the idioms of the programming language
  • using a compound name to progressively reveal the intent of the method
  • the difficulty of changing APIs whose users are unknown to us
  • the fact that naming is an ongoing battle (see remark about df, above)

I welcome your comments and criticisms.

Thursday, February 18, 2016

Sarah and Joe Write Features

I wrote a question for the twitterverse:

[Embedded tweet, paraphrased from the discussion below: Joe and Sarah were each assigned a feature to write. Joe's feature made the company much more money. Is Joe the better programmer?]
Give that question a moment's thought and tell me what you think as a knee-jerk reaction.


About Joe and Sarah



Of course Joe and Sarah are fictional characters.

No, this isn't a gender question. I thought it would be wrong of me to pick two female names or two male names. I figured there was really no way I was going to get out of this without answering questions (or charges) of gender bias no matter what I did, so I decided to pick two names roughly at random.

It didn't dawn on me until later that they're both European/American kinds of names -- so bias is probably real there. But what if one was an Indian name and the other Chinese? I can see that having all kinds of issues too. So I have two names, and please forgive me for not being able to include everyone in that. If it makes you feel better, put "Li" at the end of one and "Kuthrupali" at the end of the other.


The Conversation


But I picked this question because it tickles the right parts of the brain to make an interesting exchange about value and ability and respect.
Tom Eskeli picked up on it quickly.

No. Joe's program is more valuable to the company, but that does not mean he is the better programmer.
I tried to (good-naturedly, I promise) push him into a bit of a corner, but he wisely held his ground.

Michael Bolton also spoke up:

No. To say Yes would be to confuse value with revenue. For instance: Joe's feature may have needed Sarah's to work.
To which George Dinwiddie replied:
Joe's feature may increase margin from existing customers, while Sarah's may open up a new market. 
M. Bolton:
Yes; when was revenue evaluated? 
At this point I considered the party to be in full swing.

The discussion was suddenly one of systems and short- v. long-term measurement and interconnectedness of parts. It became rich.

In the richness of the conversation, some angles were uncovered that I'd not considered when I phrased it.  You see, I'm not a puppet master controlling the stage and giving people roles to act. I ask a question because I think it's interesting fodder for thinking. People volunteer their depth and experience and wisdom.

The Punch Line



Finally I had to give away my punch line, which is that in the initial question I stated that Joe and Sarah had been assigned their work.

They had no say in what the feature was and no control over how much money it would make. Inferring a programmer's worth from the perceived value of work that was assigned to them really tells us nothing about the programmers themselves.

Joe may have been a brilliant programmer or a lousy hack. He may have TDD-ed or not. He may have taken weeks or hours to write it. It may have had no bugs or dozens. Nothing is said about his process or the metrics and observations about the story's trip from concept to deployment.

Likewise, nothing here gives us any insight whatsoever into the efficiency, quality, or effectiveness of Sarah's work.

Those who are disciples or casual readers of Deming will realize that I have given a version of the "Red Bead Experiment" and found the twitterverse ready. Deming taught us something crucial about compensation/recognition within systems:

NEVER REWARD OR PUNISH RANDOMNESS
Excellent answers Michael, George, and Tom.

If you don't know these people, do look them up and give them a follow.


Wednesday, February 3, 2016

Naming good is what Position?

I want to drop in a quick note about variable and class naming, one of my favorite hobby horses.

I talked a little bit about choosing more completable names already. I think that is very important, and you would be well-served by paying attention to that bit.

Today let's consider the serial position effect.

In any programming namespace, we're going to come across some naming issues. Here is a non-comprehensive, non-exclusive list:

  • Objects with a common base will want to include the base class name in the derived class, leaving us with MonoNUnitTestRunner derived from NUnitTestRunner derived from TestRunner, possibly breaking naming in its derivation from CodeEvaluationRunner and on and on.
  • Objects in languages with declared interfaces have a tendency to carry the interface in their names. An iUserAction will often have a UserActionImpl, whose children may follow the prior rule above. 
  • People love naming the patterns they use in implementing an interface. You will see names like UserActionVisitorFactory. 
  • Noise words will creep in, so that you may have a Subscriber class with a base class tied to persistence. You can't call them both Subscriber unless you split them into separate namespaces, so you'll tend to name one SubscriberInfo or SubscriberDataObject or the like. Not to mention SubscriberObject and SubscriberManager and BaseSubscriber. Of course "manager", and "info" and "data" and "object" are completely noise words in an OO system, but they sure are prevalent. 
  • Fuzziness and generality will be signaled by the use of vague terms. Identifier is general, FactoryManager is general, pattern-related (possibly) and vague.  Often when we don't have a clear, unique purpose for an object, we'll use vague terms.  It might be a code smell. It means that other, similar terms will have to carry "warts" or "qualifications" to be unique. Maybe there must be a UniqueShortStringId so that we don't confuse it with the UUID or GUID we use in Id. 
  • People suspicious of the 'magic' of namespaces may feel uncomfortable with letting the namespaces and class names be sorted out by the compiler (for some reason) and may embed namespaces and class names in their object and method names. I know, it's redundant and seems silly, but it happens. We end up with CustomerManagement.Customer.getCustomerName(). Sigh.
Is it wrong to have compound names? Probably not, and besides "wrong" is an uninteresting qualifier. Maybe better questions will help:
  • Is it helpful? 
  • Is it only a coping mechanism for over-used namespaces? 
  • Is it a signal that our code is degrading? 
  • Is it a sign that we're doing a good job? 
  • Is it a temporary state, that might be leading us toward more manageable code? 

Rather than answer ethical questions that we don't have room for in this space, let's consider instead that we have compound names, for better or worse.

What ordering should we use?

The serial position effect tells us that we will recall the first and last parts of the name most clearly. This has been illustrated at even smaller scale by silly Facebook posts like this one:
"If you can raed tihs, tehn you hvae a vrey rrae gfit. Olny sveen penecrt of plpoe are albe to usdtnernad tihs txet. You hvae an azimang bairn!"
Which, of course, contains misinformation, because just about anyone who can read English at all can read it after looking for a few seconds.

The interesting bit though, is that it is true that you don't really struggle that much when the middles are scrambled because your brain doesn't really latch onto the middles.

Maybe this is why it's easier reading these really short paragraphs than the long ones above and the short bullet points instead of the longer ones.

It seems that if we are going to use long compound names, we might be better off if we:

  • Put the most important word at the front to support dot-programming.
  • Put the next most important word at the end of the name, so it's memorable and recognizable.
  • Either delete the other parts because they're noise, or bury them in the middle.
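For instance, here is a hypothetical before-and-after (both names invented purely for illustration):

 // Before: noise words pad the name, and the key word hides in the middle.
 summary.getCustomerDataRevenueInfoByMonth();
 // After: 'get' first to support completion, noise deleted, 'Revenue' kept prominent.
 summary.getMonthlyRevenue();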
Give it a shot today when you're programming. It might lead you to clearer, more easily read names!



Wednesday, December 9, 2015

The Productivity Formula

Productivity is clearly some kind of ratio; it is some kind of N:M relationship, but it's hard to know quite what the N and M are.

Of all the poor ways one may define Productivity for software developers, there are some really horrible formulae including "lines of code per developer" and "story points per iteration", measures which really measure all the wrong things and which might send Charles Goodhart into a tizzy.


Rather than rant more on those, let's cut to the chase and give the definition that most people really use:


(what you did)

----------------------------

(what I wanted)


You see, management is not really a data science. It is generally practiced as semi-educated gut feel, because most managers in software organizations are programmers risen through the ranks. Maybe a third of them have studied computer science (judging by the popularly quoted folk statistic that 30% of people work in the area of their college major). Most of them have learned to manage in a sink-or-swim way, or possibly were mentored in a "tribal knowledge" kind of way.

As a result, most decisions are based on trust and gut feel. 

This measure is a gut-feel relation, so we really should adjust it to say "what I think you did" on top, but I'm leaving it as-is.

The thing that might confuse some readers is that this is not a snarky rant. I believe that this relation is not only "true" but possibly "correct," despite all of the inherent subjectivity.

Whether it is right or not, it seems to be true, and gives us some leverage and predictive power. In other words, however imperfect and undesirable you may think it, this is a workable definition.

I have long talked about the "trust transaction" where we give transparency and delivery, and in return are given trust (generally in the form of lessened governance and increased autonomy).  Notice how this addresses the relationship.

I am not the first, nor even the brightest, to have talked about "expectations management" and "business and development working together" which improves the relation by "truing up" both entities.

Nor am I the first nor brightest to discuss the idea of delivering earlier and oftener. Notice that this helps to calibrate not only expectations but also observations.

And finally, note how not having anything to show for your past N days' worth of effort affects the relation.

I offer it here for your comment, acceptance, absolute rejection, or ridicule -- as you wish.

Friday, November 13, 2015

Plate-Emptying



Over at Reddit there is a discussion about my Stop Per-Person Swimlanes article at Industrial Logic.

This surfaced what seems to be a common misunderstanding.



Some people think that the board was designed to facilitate public shaming.

It's to motivate members in a team to work harder. When your personal progress and responsibilities are put on a clearly visible chart somewhere it's a bit different than a sea of todo's some of which are labeled with your name.
There are a lot of answers, agreeing or disagreeing (mostly disagreeing). This one stands out:

It's different. Whether it's a good thing or a bad thing is EXTREMELY unclear. 
My take is that in most corporate environments that is a BAD thing. Why? Because, rather than thinking about the overall health of the code, coder's main concern is getting that ticket closed [emphasis mine - tro]. 
Which is a very straight path to the quick codebase degradation. 
Meanwhile, your velocity is high and management is happy while the project is quickly running into the ground.

This brings us to our topic of the day: Plate-Emptying.


Charlie developed a technique for getting the veggies off of his plate.


Plate Emptying - However silly it sounds...

Plate-emptying describes any behavior that aims to increase performance in the small by minimizing the touch-time for each subtask performed.

I have witnessed it wherever there is pressure on people to "finish quickly" or to "increase velocity" without any sort of work off-loading. It's especially visible where issues of quality are downplayed in favor of speedy task completion.

A pathological made-up story:
Imagine your dentist's office is running behind. The oral hygienist skips cleaning your teeth, gives you the bill for work not done, and sends you home.  Pretty soon the office is back on schedule and has more billed time on the books than usual!
Imagine that the dentist who heads the practice gives that hygienist a raise, and berates the others for being comparatively unproductive.

Yes, that's absurd. But stay with me here, because many software companies operate by rewarding those who give their work short shrift and declare early completion for work partly done. Often they drive out developers who are more complete and disciplined in their work.

But... this is SOFTWARE!!!

Yes, I know that software development is rumored to be a low-stress occupation, pleasant and quiet, with good rates and plentiful employment. That is not how many corporate programmers experience it. In fact, their lives seem to be full of death marches and plentiful stress, which cause mental health issues.

Managers are in a horrible crunch most of the time. Big customers demand, and therefore are promised, a lot of work.

In software, everything is ASAP (or presented as if it is, to keep "motivation" high).

True Story: 
Managers had a significant defect backlog (tens of thousands) and were talking about two teams. 
One team is methodical, and sometimes could take a day or two to fix a defect, but they were always truly fixed and passed through testing and to deployment without any issues. 
The other team emptied their plate quickly, usually in less than half a day, but often had to revisit the same bug several times because their fixes didn't always really work, or sometimes had troubling side effects.   
After deliberation, the managers decided to give the lion's share of bugs to the latter team because "they get things done faster." 

I suppose "done" in this case refers to plate-emptying, not full completion and deployment. All the numbers showed that tasks given to the "fast" team took considerably longer to deliver due to the rework involved, once you count in the round-trips to QA and the rework.  But they only looked at micro-efficiency, not at the throughput of the whole system.

Another true story, titled "Heisenberg Developers" was posted and came to my attention via twitter.

If we use an industrial-era kind of thinking, then we are limited by the speed of the slowest component in the chain, and since software is apparently the work of people, it is obvious that we need all the people to do their work faster.

Surely, if everyone works faster, the whole line will be moving at maximum speed, and therefore the product will be done as soon as possible (in software, everything is ASAP!).

The clear, simple, obvious answer, then, is to get each person to do their fastest work. We will hold them to a short turn-around and reward them for reducing their time budget for the work being done.

And since there is too much work, we need to start it as soon as possible, cramming it all into a queue until development staff (coders, testers, etc.) are split between several (sometimes dozens of) tasks. They will "just have to multitask."

What could go wrong?



For every complex problem there is an answer that is clear, simple, and wrong. - H. L. Mencken


Note how this theory of "go faster by rushing everyone" avoids looking at workflow and queues and Little's law, and all the things we've learned about multitasking and cognitive function, and the basic principles of knowledge work and Lean management. This is conveniently, obviously, simply, horribly wrong thinking.

Nobody was counting the system's time, only the worker's time.

Nobody was counting the wait states, only the touch-time.

Lean thinking teaches us to watch lead time, and to understand the system. That keeps our optimization processes focused on actual bottlenecks and constraints.
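As a quick worked example of Little's law (the numbers are invented), the law relates exactly the quantities that plate-emptying ignores:

 average lead time = work in process / throughput
 30 items in process / 5 items finished per week = 6 weeks of lead time

Rushing each task's touch-time changes neither number; only finishing more per week, or starting less, shortens the wait.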

However, functional managers (over programming, over testing, over documentation, etc.) don't have authority or influence over the entire system, and tend to focus on their realm of influence -- their specific function. I can hardly blame them.

So what happens?


When a designer has too much work on his plate, he can always "abandon as done" once he has a rough idea, a couple of sketches, a wireframe, a spreadsheet, or maybe a proof of concept in PowerPoint or Visual Basic. That's close enough, and we need to get this off his plate (and onto someone else's). Sometimes it really is "close enough" and the developers will figure it out; if not, it will come back from UX or acceptance testing and can be handled then. If it gets out the door and customers don't like it, we can handle it with customer service and fix it in the next release (taking it off the designer's plate for weeks or maybe months).

An architect's best tool for plate-emptying is to just say "no," since it is faster than evaluating an alternative, and after all there is so much to do.  Sadly, the architect is usually valued for how many things she has in process at once, rather than for her rate of completion or overall impact on the project. It hardly makes sense to defer, delegate, or refuse when one is being graded on how busy one is.

A developer will probably shortchange quality, avoid deep thinking about side effects and consequences, and dodge meetings and discussions by making guesses. After all, it has to go through testing anyway, and testers will send back any code that doesn't really work (well enough).

There isn't time to spend on this feature, so she should not bother with refactoring, test-driving, pairing, or reviewing. Those things feel like "taking time for quality," and she is being rewarded (reputation-wise, at least) for turning things around quickly: emptying her plate.

A tester will probably have to decide which tests he will not run this time, so that he can move forward quickly. Perhaps he can downgrade the severity of some bugs. Returning it to the developer will get it off his plate fastest, and is safer because testers tend to be blamed for every defect not caught.

Of course, there may be bug teams.  My "true story" above describes how that works.

Let's not get into operations. There is no place that plate-emptying stress is higher, and tolerance for errors is lower. You've got to love ops people, because everything is an emergency and nothing is appreciated, and they still show up. In fact, they show up at insane hours and weekends and holidays and such. They miss the kids' little league games and family birthdays. They work Christmas eve. Give your operations team some love; do something to make their job easier.

Customer support is held to tight schedules. How can they empty their plate faster? What if they pass a client off to some other representative? What if they just give some stock advice and abandon the user? What if they just recommend "reboot and call back"?  We have all seen instances of tech support people held to a standard of "45 seconds per call" so that they empty the queue rather than serving the customers.

Managers have to deal with plate emptying, too. They have to manage people less and shuffle lists and calendars more in order to keep up with the activity, all the while pushing to increase that very burden. Sometimes people-centric mentors who make it up the ranks are stuck being "meeting denizens" instead of really managing projects and people the way they always wanted to.

Most everyone came to the industry intending to do good work, and to do it proudly. But organizations that value plate-emptying and busyness can destroy pride in workmanship by demanding brute-force solutions and intense activity from employees while ignoring larger system problems (such as turbulent work flows and deferred learning) which actually prevent building and shipping working software.


What About Agile?

When agile goes well, it makes good workmanship (craftsmanship) and good decision-making possible. It reduces stress and rework for everyone involved, and builds greater solidarity across roles through the use of cross-functional teams.

Ideally, agile methods make it possible for teams to work at a sustainable pace, and focus on value rather than quantity. There is focus on economy of scope rather than economy of scale.

In particular, modern agile is built on safety, continuous definition and delivery of value, and continuous learning. Perhaps this is the way it was always supposed to be.

BUT, what we tend to see in many organizations is a fascination with velocity. When teams are pressured not only to take on the maximum workload but also to complete it within a too-short fixed period, they become exhausted and frustrated, and are driven to obsessive plate-emptying behaviors.

This is all about the stress-performance curve at that point. Developers are stressed and exhausted past the point where they have good cognitive function, and plate-emptying is all that is really left for them.

Isn't That A Good Thing?

Yes, I realize that "the perfect is the enemy of the good" and that people really don't have time for infinite gold-plating and "cadillac" answers. Who can afford perfect code?

But no legitimate business can survive having adopted "quick and dirty" as its standard of excellence.

With sufficient design and reasonable automated tests, software is transformed so that most changes become reasonably easy and quick to perform. Not all changes become "easy," but they become easier than they would have been had the code been allowed to decline into a comical and tragic mess.

Many times plate-emptying delays release of useful software by weeks or months.

And, thanks to Singh's Inevitable Avoidance, companies whose developers, testers, customer service, and operations have been playing "hot potato" with defects for months or years have developed much more efficient ways of playing hot potato, and have even come up with metrics to show how efficiently they are improving the speed of hand-offs.

Often focusing on "minimal touch time" creates unproductive (even counter-productive) pressures, which in turn result in release disasters, interpersonal conflict, and loss of staff.

Despite what you might read sometimes, being a software developer in some companies is disturbingly stressful to the point of causing stress-related illnesses and panic attacks up to six months later.

I suspect (and observe) that the reason behind poor craftsmanship, poor support, and poor service is often pressure for individual plate-emptying rather than cooperative completion of tasks. This pressure often results in practices like "rank-and-yank" (AKA "hunger games"), where staff are retained or released based on "productivity" measured as speed of plate-emptying.

Should We Bust Those Lazy Plate-Emptiers?

Blaming and shaming is generally useless. In this case it is also unfair. It is wrong to demand people do a thing, reward them for it, and then blame them for not resisting the system and refusing the rewards.

When I toured around the US with the "perfect day" exercise, I found that knowledge workers are almost never slackers. Their ambition, almost universally, is to take on a problem that is hard (for them) and to work it to completion. A perfect day isn't "goofing off and playing video games at work" for them. It's doing good work and being respected for it.

So what's with the plate-emptying? What does it mean?

People are doing what they are doing because the system demands it and because they are too fatigued to fight it.

If you don't want people to abandon work for others to deal with, you have to give them enough time to work something to completion instead of racing through it to "get it off their plate."  In doing the work, they must be encouraged to find ways to make doing a complete job better, faster, and more consistent. There isn't always such a way, but help people build habits that take a lot of stress and decision-making out of the equation.

It is hard for some to believe, but if you want next month's work to be faster, you have to spend a bit of this month improving the code that the developers will start with next month.

Deming said long ago that all anyone wants is to take pride in their work. We need work systems that respect the human limits of the people who do the work, and we must not solve intellectual problems with brute force (more hours, more people, more typing).

Beyond that, most programmers I know have had a job in the past where pressure and plate-emptying were common practice, and sometimes those habits are carried into new job situations where plate-emptying behaviors are unnecessary and unwanted.

It's good to reflect on where you are, and what is most important to your customers.