Mindset Monday: Practice Makes Perfect, or At Least Better. Part 1 of 2.

Early last week I opened a computer program I hadn’t used in a while. Even though it was a program I’d used frequently in the past, it took me a few minutes to get my bearings. I had to look through the menus to find where the options and commands I wanted were located.

Fortunately, I was working by myself and had the time to rediscover where everything was located. Every program has a logic to how the menus are organized and how actions are named. I had time to remind myself of how all that worked.

But what if I had been asked to demonstrate this program for someone else? What if I had been asked to teach someone else how to use this program?

I definitely would need some time to practice.

It is not unusual to practice a skill.

It is not unusual for me or for anyone else, even though I know many people who expect themselves and everyone they work with to load the use of a program into personal memory as quickly as that program loads into computer memory.

I believe this is a relatively new attitude. I recently read a book about couture sewing, which is very high-end and expensive sewing, usually done by hand. And the recommendation in that book was to practice on a piece of scrap fabric before working on the actual garment. It’s quite common for crochet and knit patterns to recommend swatching to practice the pattern with the yarn being used.

It’s not unusual in many areas of life for practice to be recommended, or even mandated. For high-profile jobs in technology, classes and books will often recommend practicing before performing in front of crowds or clients. It is usually people who use technology only in passing who expect that no practice and no reminders are needed.

Mindset Monday: Help and hindrance, standards

At one time I read product standards as a full-time job. I left that job years ago but I still look at what standards a product says it complies with, or is expected to comply with.

A simplistic description of standards:

Many standards do serve a useful purpose: they set expectations for a product. Depending on the standard and who issued the standard, those expectations might cover safety, features, performance, reliability, or other things.

Some standards are free, some cost a bit to purchase, some cost hundreds of dollars to purchase. Some are fairly straightforward to read, some are very dense. The trickiest seem straightforward when reading them, except there are certain terms which have a specific meaning in the industry or market covered by that standard, and that meaning isn’t well known to people outside that industry or market.

Standards can become a hindrance when the market expects or insists a product has to meet a certain standard. A person might have a good product idea but find themselves in an industry or market where the required standard is very expensive to buy or very expensive to comply with.

Standards are by definition reactive and a reflection of the past. Standards describe what has already been made and how it should be made going forward. I don’t know of any standard which was written about an imaginary product, in the hopes someone would read the standard and create a product to meet that expectation.

Standards are a really good way to show the limitations of language in describing the world.

Standards are initially written with an ideal something-or-other in mind. As time goes by, there are revisions which are almost organic in growth. These revisions usually come from someone trying something which didn’t work, or didn’t work as expected.

If a standard is written very precisely and explicitly, it’s easy for someone to avoid if they want to: find a way to describe their product which is different from that precise definition. Then the standard doesn’t apply. And if the definition is written more broadly, then someone who wants to avoid it can argue about the meaning of the words or the intent of the writers. And the standard still might not apply.

Any product or facility which was built or designed more than five years ago, and is being held to a standard whose initial edition was written more than five years ago, will have at least one place where the language or practices have shifted enough that someone could claim the standard wasn’t being met.

The best way I’ve found to learn a standard is to write a summary of each clause. That’s also very painful and arduous.

Why am I talking about all of this?

I don’t get to turn my brain off because somewhere a product standard got mentioned. I don’t get to turn my brain off because a product says it complies with a certain standard. And I don’t get to turn my brain off because a product doesn’t say it complies with a certain standard.

Standards can be helpful. Like any other tool, they can also be a hindrance.

Technician Tuesday: Some of the measurements for light

The human eye has a huge dynamic range for the amount of light it can use. We can read in bright sunlight and most of us can read by candlelight. Yet the difference in the amount of light from those two sources is almost ludicrous.

Trying to decide what light source to use by looking at specifications, instead of repeatedly buying and trying, is also almost ludicrous. There are multiple measurements in use, and they don’t all measure the same thing.

The best description I’ve seen recently is in the article “Lumens, Candela, & Lux” by Richard Nance, in the January 2023 issue of Guns & Ammo magazine.

(No, I still have not looked up how I should be citing my sources.)

For the terms lumens, candela, and lux,

  • lumens is how much light comes from a source,
  • candela is the intensity of the light in a chosen direction, and
  • lux is the amount of light on a surface when it’s a specific distance from the light source.

If I’m buying a light for my desk or nightstand, I’ll probably start looking for a lux specification. For portable lights like flashlights or worklights, I’ll look at both lumens and candela to see how much light there (theoretically) is and how focused that light should be. A small focused beam of light is great when I want to see tiny detail. A wider beam is better if I want a more panoramic view, such as if the power goes out and I want to see the entire room well enough to not run into the nearest table.
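
If I want to sanity-check a spec sheet, the three measurements are related by some simple geometry, at least for an idealized point source with a perfectly even beam. Real lights have hot spots and spill, so treat this as a rough sketch rather than a lab-grade calculation; the flashlight numbers below are made up for the example.

    import math

    def lux_at_distance(candela, distance_m):
        # Inverse-square law for an idealized point source:
        # illuminance (lux) = intensity (candela) / distance squared (meters).
        return candela / distance_m ** 2

    def lumens_in_beam(candela, beam_angle_deg):
        # If the intensity were uniform across a cone with this full beam angle,
        # luminous flux (lumens) = intensity (candela) * solid angle (steradians).
        solid_angle = 2 * math.pi * (1 - math.cos(math.radians(beam_angle_deg) / 2))
        return candela * solid_angle

    # A made-up flashlight spec: 10,000 candela with a 20-degree beam.
    print(lux_at_distance(10_000, 5))    # 400 lux on a wall 5 meters away
    print(lumens_in_beam(10_000, 20))    # roughly 955 lumens if the beam were uniform

Going the other direction, a bare bulb that claims 800 lumens spread evenly in every direction works out to only about 64 candela (800 divided by the 4π steradians of a full sphere), which is why a big lumen number by itself says very little about how bright a spot on my desk will be.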

Light temperature is an entirely different topic.

And then . . . ugh. The article mentioned there was a flashlight standard, ANSI NEMA FL 1. I thought that would be interesting to look at. And it was interesting, in a way that’s mildly frustrating and not at all what I expected. I’ll make that a part 2. There might be a part 3 where I post some good links I found about the flashlight standard which don’t cost at least five hundred dollars to view.

Mindset Monday: Know what you want to achieve before you start asking for assistance.

I often talk to people who say “Hey, you’re good at computers, how do I . . .”

Over decades of experience, I’ve learned to ask why they want to do whatever-it-is they are asking about.

Frequently the answer to “Why?” will reveal a belief that computers and digital technology are magic. They must be magic, because the questioner has no idea how it works, but the technology can do all these amazing things. Someone who can work with one part of the magic must be able to use any of the magic, right? The answer is “No.”

Usually the assumption isn’t exposed and challenged until it’s too late. A friend, co-worker, or relative was begged, cajoled, or drafted into helping with one technological project because they’re comfortable with a different type of technology. No set goal was specified. The person asking didn’t know what was reasonable to ask. The person trying to help wasn’t sure what was possible or how much effort and pain any of this would require, especially if it was a type of technology they weren’t familiar with. It ended with everyone being vaguely frustrated.

This is why ComputerGear has a t-shirt for sale which says “I’m a {Programmer}. I write <code>. I don’t fix computers.” and used to have a t-shirt which said “No, I won’t fix your computer.”

(Yes, I still need to look up what is the correct citation format for citing websites.)

Mindset Monday: The tool is not the skill.

Way back at the beginning of this blog, I wrote about the importance of knowing what I want to achieve when I start working with a piece of technology. That post was about the importance of knowing my goal and motive.

My post today is about the importance of not confusing the tool with the skill. There are lots of drawing and art software programs available, but none of them make me a good artist when I buy them. There are lots of software programs for music and sound available, but none of them make me a good musician, composer, or sound technician just because I bought them.

Becoming good at a skill takes a lot of work. It takes practice, and research, and looking at other examples in that same field, and more practice, and more research. It’s a slow process. I have to put in the work. I can’t trade money for the software program or electronic gizmo or whatever and have that also be a trade of money for time and effort. The tool is not the skill.

Mindset Monday: What is old might be new all over again.

Every skill has multiple levels to it. I learned to write in school and did fairly well with school assignments. Learning to write emails while I was working in office jobs required learning new things about a skill I thought I knew. I’ve written in different ways for different reasons since then, and each time I learned something new about a skill I learned a long time ago.

Technology is like that in two ways.

Being comfortable with technology, learning new technology, and deciding what I will and won’t trust technology to do is one skill. And it’s a skill that has new levels every few years.

Learning how to use a particular piece of technology is also a skill in which I find new layers every few years. This last week it was a camera. I’ve enjoyed taking pictures for over 30 years. I’ve used this particular camera for well over a year, maybe two or three years, and have taken many photos with it. And I still found something new to research and try.

Mindset Monday: The digital world is not the real world. The real world is more complicated and more unpredictable.

I read blogs and newsletters about technology. I vaguely noticed most of what I was reading was about software more than hardware. I honestly didn’t think much about it.

Then I started wondering why there is so much more focus on software.

A few things happened.

I talked to a younger friend who had just changed careers. Her earlier career had been very computer- and software-intensive. I encouraged her to find something she was interested in and start reading about it. I told her “I’m glad we have spreadsheet programs instead of the old hand-cranked adding machines my grandmother let me play with as a kid. But the digital world is not the real world. It’s an abstraction of an abstraction of an abstraction of a specific use case of a finicky and non-intuitive way of manipulating natural forces.(1)” I also told her that anything in the real world she chooses to read about will reference other areas. Sociology, anthropology, psychology, history, chemistry, metallurgy, and mineralogy are all areas I’ve wandered into by reading about something in the real world which interested me.

Earlier this year I read The Pragmatic Programmer – your journey to mastery, 20th Anniversary Edition by Thomas and Hunt. It’s a very good book. I highly recommend it. It is about creating code that one day will have to change. That means making the code as easy to change as possible, and as easy as possible to change without breaking everything else. I’m going to explicitly point out this was addressed in the real world long ago. There are very few books about how to build a house so taking out a cupboard in the kitchen doesn’t cause the basement window to no longer open. There are very few books about how to sew a shirt so hemming the bottom doesn’t mess up the collar. And there are very few books about designing a car so changing a flat tire doesn’t create a hole in the radiator.

In July I read two articles in The Register about a lack of hardware engineers.

My own opinions:

I know from experience that electrical engineers who design the hardware have to take higher level math classes than the computer scientists who program the software.

That was the case 25 years ago. I’m not sure if it’s still true now, but I expect it is. I’m also not sure how it compares for other fields such as computer engineering or software engineering.

I also know from experience that it’s a lot easier to try out new ideas in software than hardware.

A new program can be written, tried out, and erased with the only loss being a little bit of electricity and some time on the part of the programmer. A hardware circuit, no matter how well it works or doesn’t work, still leaves physical hardware behind after the project is done. The hardware has to be either disassembled so it can be used in something else, or completely scrapped. A component soldered to a circuit board is not reclaimed with the push of a button the way computer memory is when a file is deleted.

And I know from experience that the real world is far more humbling than the digital world.

I can try to write while tired, mess it all up, have autocorrect fix numerous mistakes, delete a whole bunch of stuff that makes no sense on rereading, and then forget about my mistakes and think I did a great job all along. A physical project such as drawing, crocheting, sewing, folding clothes, ironing shirts, or whatever else is much more obvious when it’s messed up. It takes a lot longer to fix something in the real world. I might have done something unfixable. Even if I redo what I can undo and fix what mistakes I can fix, I’ll remember all that the next few times I look at what I made.

Technology is both software and hardware.

When I say I like technology, or that this blog is about making technology work for the user instead of making the user work for the technology, that is hardware too. It’s not just software.

Why I came up with the long “abstraction of an abstraction . . .” description

“It’s an abstraction of an abstraction of an abstraction of a specific use case of a finicky and non-intuitive way of manipulating natural forces.”

(1) “It’s an abstraction . . .”: Most programmers do not program at a level where they are telling the computer which specific memory cells to use and what specific processor logic commands to use. Most programmers write at a more human-readable and human-understandable level. A compiler turns their code into something the computer can understand.
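
As a small, hedged illustration of that first layer: Python’s standard dis module prints the instructions Python’s own compiler produces for a function. What it shows is bytecode for a virtual machine, still well above the processor, but it makes the jump from human-readable source to something much more mechanical easy to see.

    import dis

    def add(a, b):
        return a + b

    # Print the instructions the compiler produced for add().
    # The exact opcode names vary by Python version (BINARY_ADD in older
    # versions, BINARY_OP in newer ones), but the idea is the same:
    # readable source becomes a short list of lower-level instructions.
    dis.dis(add)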

“. . .of an abstraction . . .”: No matter how amazing it looks or sounds or what it does, all human-readable computer programs are converted to a language or code that tells the processor what to do in language the processor understands. For the processor, there’s inputs; there’s outputs; there’s memory; and there’s commands to the processor to read an input, read memory, do something with what it read from the input or memory, write information to an output, or write information to memory. To the processor it’s all high or low electrical states, called 1s and 0s by humans.

“. . .of an abstraction . . .”: Multiple transistors can be connected, along with some other components, to switch signals, have some logic about whether an output is high or low based on multiple inputs, and hold that high or low state for a time. That’s a very basic description of a processor with memory.

“. . . of a specific use case . . .”: Transistors can be configured to operate as an amplifier, or they can be configured to operate as a switch. For digital circuits, they are configured to operate as a switch.

“. . .of a finicky and non-intuitive way of manipulating natural forces.”: Transistors are made from semiconductor materials. For electricity, most materials either conduct electricity and are called conductors, or they do not conduct electricity and are called insulators. Semiconductors conduct electricity under certain circumstances. Semiconductors are made out of very specialized materials which themselves are not easy or intuitive to make.

The digital world is not the real world.

Mindset Monday: Always be looking.

There’s nothing I do which is new in the history of the world. There’s a lot I do which is new to me. (Or it was new to me at one time.)

There are always new ideas on how I can do things. Sometimes I find new ideas in unexpected places. Sometimes the new idea is about something I was pretty sure I knew how to do, and then I find out there’s a much simpler, easier way.

The task might be new to me, but there’s someone out there who’s done this for years and has tons of experience. I should go find that person, or find something they wrote, and try to learn all I can.

Another way to put this: “I’m completely self-taught!” is often not the bragging point some people think it is.

Mindset Monday: Technology will change. Human nature does not.

Each major change in technology brings about claims that it is a new day for civilization and mankind. “It is different this time.” Both you and I have seen those claims about smartphones and the internet today. The same things were said about the industrial revolution and about the rise in literacy after the printing press became more common in Europe. I can probably find similar statements written about every single technological advancement in every single field and industry that exists.

All of those statements were wrong. Human nature did not change. Technology changed, and it changed some parts of the world. Technology did not change human nature.

Things which haven’t changed.

We all have limited amounts of time, energy, and attention. We are all unable to trade those things with each other.

  • I cannot buy an hour of your day so you have 23 hours and I have 25 hours in my day.
  • I cannot sell you my ability to focus so today you can focus for 12 hours instead of 3 and I can’t focus at all for the rest of this week.
  • There is no millionaire or billionaire who can say “I really ran myself down last week, can someone sell me some of their energy so I can keep working on my product launch this week?”

Technology is still created by humans who are very human.

  • Some inventors and creators will create something because they love it and they are creating it for other people who love it.
  • Then there will be inventors and creators who love getting paid above all.
  • And there will be inventors and creators in between those extremes.

How I choose what I use

How much time, energy and attention do I want to spend on this?

Which options are made for my level of expertise?

What is my goal for using this?

Technician Tuesday: Finding simpler tools. Word processors and photo editing as examples. (2022 Aug 23)

On Monday, I wrote about viewing my time and effort as limited resources. When a piece of technology, hardware or software, starts taking too long to use, I look for alternatives.

The time to learn and use a simple alternative for one task is often shorter than making a more complicated program do that task.

(AutoCAD was infamous for this in the 1990s and 2000s: it could do almost anything if you took the time to figure out how. I’ve never set up ERP systems, or even simpler inventory systems, but I’ve talked to many people who spent far more time managing their ERP / inventory / POS systems than they ever spent managing paper records. And Microsoft Excel has its own eSports World Excel Championships, which was broadcast on ESPN2.)

First example: word processors and spreadsheets.

I didn’t use Microsoft Office for several months, a few years back. I logged in to Microsoft Windows with a different email than I’d used to purchase a Microsoft Office subscription. Microsoft was very concerned. I was constantly asked by Microsoft Office if I wanted to change my account. (No, I did not.) I got “Microsoft account problem” warnings from Microsoft Windows. Then I got stuck in the maelstrom of Windows wanting Windows Hello, and Windows Hello wanting facial recognition or a fingerprint sensor, and I said the heck with this.

There are a ton of things Microsoft Office can do. I wanted a simpler word processor. So, I downloaded LibreOffice. It installs fast. I don’t get any Microsoft account errors. The program does occasionally crash, so I save often. That’s the only drawback I’ve found. Now, I only buy a Microsoft Office subscription when I need to use a Microsoft Office document or spreadsheet with features only Microsoft Office supports.

Second example: photo resizing and watermarking.

I use a digital camera which creates large files. I’m not going to ask my friends to download huge photo files when they want to look at my pictures. I could use a photo editing program like Photoshop or GIMP to edit each photo, decreasing the file size and adding a watermark with my name and the year. I could learn how to create macros in a photo editing program.

I downloaded AVS4You instead. There are lots of other alternatives; I use AVS4You, but use whatever you want. The photo resizing program (technically the image converter) from AVS4You is free to download. I load the photos, then set the file name modifications, the file size modifications, the watermark, and which directory to use for the new files.
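
For anyone who would rather script this kind of batch job than click through a converter, here is a minimal sketch using Python and the Pillow imaging library. The folder names, size limit, and watermark text are made up for the example, and this is not how AVS4You works internally; it’s just the same idea expressed in code.

    from pathlib import Path
    from PIL import Image, ImageDraw

    SOURCE_DIR = Path("camera_photos")   # hypothetical folder of large originals
    OUTPUT_DIR = Path("web_ready")       # hypothetical folder for the small copies
    OUTPUT_DIR.mkdir(exist_ok=True)

    for photo in SOURCE_DIR.glob("*.jpg"):
        with Image.open(photo) as img:
            # Shrink so neither side is longer than 1600 pixels, keeping the aspect ratio.
            img.thumbnail((1600, 1600))

            # Stamp a simple text watermark near the bottom-left corner.
            draw = ImageDraw.Draw(img)
            draw.text((20, img.height - 40), "(c) My Name 2023", fill="white")

            # Save a smaller JPEG with a modified file name.
            img.save(OUTPUT_DIR / f"{photo.stem}_web.jpg", quality=85)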

No, I don’t get any compensation from any program or company I mention using. You can use whatever you want; I’m giving examples of what works for me.