So, you had a bad day.  Happens to the best of us.  Even so, you shouldn’t let it color your vision of the future.  Let’s take the example of legal technology applications.

When lawyers deign to upgrade their software, after much cajoling and a number of conversations over the rotary dial telephone, the whole traumatic experience often becomes a self-fulfilling prophecy of the most dire proportions.  Attorneys are generally ruthless when it comes to implementations; the operating thesis is this: if it’s not working right away, it ain’t gonna work.  So, one of two negative results accrues: (1) the new technology is grudgingly accepted, but nobody is allowed to advance past the primary features, due to a perceived continuous downturn in productivity; or (2) the new technology is repackaged, sent packing, and you’re back to using the three-versions-old tool everybody is already (un)comfortable with.  (Right, Ethel?)  In the rare instance of a positive result, it looks more like this: the deciding lawyer is essentially pleased with the intuitiveness of the base features, and latches onto (roughly) 4% of the usefulness of the new program.  In many cases, lawyers give short shrift to the vetting process for new technology, and declare the endeavor a win or a loss after a day’s (or less) review.

What that means is that, in large part, attorneys DTR with their technology over a very short course of time; but, even Kristoff knows that just ain’t right.

So, the next time you test a new technology in your office, give it a fighting chance; try these tips:

Play in the Sandbox.  Instead of going through the motions of a generic software test, try taking the time to build a ‘sandbox version’ of the software you’re vetting.  Set up some of the actual features that you would use in your practice.  Test out defaults, but also attempt some customization.  Imagine how the software could improve on what you do, and then see if it will conform to that notion.  Get as close to a real-world scenario as you can in your testing protocol, and you’ll have a much better idea of whether the software you think you need is the software you actually need.

In Time.  If you’re going to set aside some time to demo a product, why not set aside all the time?  Most providers offer a 30-day trial, and you should take full advantage of it.  They say nothing’s free; but your 30-day free trial is, so it makes sense to make full use of that time.  If the calculus is that you may be choosing a software program you’ll use for the remainder of your law practice career, invest your time on the front end to reap the benefits in the back office.

Staff Up.  It’s essential to get buy-in from your staff when it comes to adopting new technology, because, if you don’t, they can easily nullify your choice.  Not only is it difficult to get out of your own head when you’re vetting a product on your own, but a solo review also fails to offer a realistic picture of how you would use the product in collaboration with your colleagues, which is how you will actually use it.  So, get your staff involved in the vetting process; and . . .

Formalistic.  . . . And acquire their impressions.  If you’re feeling particularly salty, you can develop a question-and-answer submission form to collect their comments, concerns, etc.  A Google Form should do just fine, or you could use a free or enterprise survey tool, like SurveyMonkey.

All-Star Band.  An invested staff is a happy staff, sure; but, more than that, an invested staff can become a teaching staff.  Don’t want to pay for a support package?  Involve your staff in vetting your new programs and features, pique their interest, pick your all-star users, and encourage them to be informal trainers who can lead others to the promised land of higher-percentage use of program features.

Syncopation.  One of our operating themes in this post has been: don’t do this alone.  Don’t review your potential software fits in a vacuum.  That premise, however, extends beyond peopling your review process.  Consider that you’ll never operate a software application to the exclusion of others.  It’s 2016: you’re syncing with something, even if it’s only your email calendar.  Therefore, when you’re testing software, it makes sense to also test its connections with your existing software platform, to make sure everybody plays nice.

If you invest the time to produce an in-depth software review, the software you ultimately choose will pay back compound returns over the life of its application.

. . .

Liner Notes

‘Seasons of Wither’ by Aerosmith

Jessica’s a big Aerosmith fan, as you may know.  So’s Cutter.  Jessica, though, does not like ‘Seasons of Wither’, believe it or not.  I have to say, however, that I am firmly in the Jeff Dino camp on this question.