Personal Cybersecurity Published

Last week, Apress released my latest book, Personal Cybersecurity.

Personal Cybersecurity is available directly from Apress and on Amazon.

A lot of people helped me with this book, many of whom I mentioned in the Acknowledgements, but there is a large group whom I did not mention: the people who attended the talks I gave at the Ferndale Public Library last winter. They gave me a real sense of what non-IT professionals need to know and how computing must be explained so that someone without a professional background can understand the issues. They all get a big thanks from me.

With the help of my audiences, I hope I succeeded in writing a book that has enough technical depth that folks can understand the issues and make intelligent decisions rather than follow a set of rules by rote.

For those who are interested, I am giving the same series of talks at the Lynden Public Library. The remaining two talks are at 1:00 p.m. on Saturday, January 28, 2017, and Saturday, February 4.

Personal Cybersecurity: How To Create a Local Account

I am turning in a new direction in this blog. Up to now, I have been writing for software architects and engineers. Apress has given me a contract to write a book called Personal Cybersecurity, and I've just given a series of presentations on personal computer security at my local library. My previous two books, Cloud Standards and How Clouds Hold IT Together, are both aimed at enterprise software engineers. I hope Personal Cybersecurity will appeal to computer users in general, not just engineers. I gave the library presentations to see what happens when I talk to people who don't speak engineers' jargon. When I returned to my office after the last presentation, I was fretting about all the important things that I had left out. For the next few blogs, I'll write for the folks who attended my presentations: computer users who know how to use their devices but are not software engineers or developers. I'll try to fill in some gaps in the presentations, and perhaps go further.

For people who did not attend the presentations but would like to read the PowerPoints, here are One, Two, and Three.

Create a local admin account tutorial

The first gap I want to fill is a step-by-step tutorial for setting up a local admin account on Windows 10. In the presentation, I warned that running as administrator all the time can make a hacker's life easier by freely offering them administrator privileges. I forgot to mention that creating a local account in Windows 10 is like thrashing through a maze blindfolded with rocks in your shoes. The twists and turns are hard to follow. Here is a link to a PowerPoint tutorial that shows every step in Windows 10. Earlier versions of Windows are less convoluted, but the steps are roughly the same. See the tutorial here.

Coming Soon

I am wincing over the meager advice I gave on detecting when a device has been hacked. It will take more than one blog to compensate. In the next blog, I plan to write about one of my favorite tools for spotting a hack beyond anti-virus: the Windows task list. I'll try to keep the discussion simple, but the task list is an advanced and powerful tool, and using it takes some understanding of how computers work. I think the extra effort will be worth it, because understanding the task list can help with more than just hacks.

But that is for next time.

How is OVF different?

The number of standards, open source implementations, and de facto standards for cloud management is growing: CIMI, OCCI, TOSCA, OpenStack, Open Cloud, Contrail. OVF (Open Virtualization Format) is a slightly older standard that plays a unique role, but architects and engineers often misunderstand that role.

Packaging Format

What is OVF and how is it useful? OVF is a virtual system packaging standard, and it is easy to get confused about exactly what that means. A packaging standard is not a management standard, and it is not a software stack. The OVF standard tells you how to put together a set of files that clearly and unambiguously define a virtual system. An OVF package usually contains the disk images of the virtual machines that make up the system. It also contains a descriptor that specifies the virtual hardware configurations that will support those images and describes how the entire system should be networked and distributed for reliability and performance. Finally, there are security manifests that make it hard for the bad guys to pass off an unauthorized version of the package that is not exactly what the original author intended.
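The manifest is the simplest of the three pieces: it is a text file containing one digest line per file in the package. Here is a rough sketch of how such lines could be generated. The function name and directory layout are my own illustration, not part of any standard tooling, and real packages can additionally carry an X.509 signature in a companion certificate file.

```python
import hashlib
import pathlib

def manifest_lines(package_dir, algorithm="sha256"):
    """Return OVF-manifest-style digest lines, one per file in the
    package directory, e.g. 'SHA256(disk1.vmdk)= <hex digest>'."""
    lines = []
    for path in sorted(pathlib.Path(package_dir).iterdir()):
        # Skip the manifest itself and anything that is not a regular file.
        if path.suffix == ".mf" or not path.is_file():
            continue
        digest = hashlib.new(algorithm, path.read_bytes()).hexdigest()
        lines.append(f"{algorithm.upper()}({path.name})= {digest}")
    return lines
```

A consumer of the package recomputes the digests and compares them with the manifest; any tampering with a disk image or the descriptor changes the digest and is detected.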

A packaging format is not designed to manage the systems it deploys. To expect it to do that would be like trying to manage applications deployed on Windows from the Control Panel's "Uninstall a program" page; there are other interfaces for managing applications. The OVF standard also does not specify the software that does the installation. Instead, it is designed as a format to be used by many virtualization platforms.

Interoperability

OVF is closely tied to the virtualization platforms on which OVF packages are deployed. The DMTF OVF working group has members from most of the significant hypervisor vendors and many significant users of virtualized environments. Consequently, an OVF package can be written for almost any virtualization platform following the same OVF standard. The standard is not limited to the features the platform vendors have in common. If that were the goal, OVF packages would be interoperable: a single package would run equally well on many different platforms. But because platform features vary widely, such a package would of necessity be limited to the lowest common denominator. In fact, interoperable OVF packages can be written, but they seldom are, because most OVF users want to exploit the unique features of the platform they are using.

An OVF Use Case

Is an OVF package that is not interoperable among platforms useful? Of course it is! Let’s look at a use case.

Here’s an easy one. I have a simple physical system, let’s say a LAMP stack (Linux, Apache, MySQL, and Perl, Python, or PHP) that requires at least two servers (one for an HTTP server, the other for a database). I use this configuration over and over again in QA. If I want to go virtual, an OVF package for this LAMP stack implementation is simple to write. After the package is written, instead of redeploying the system piece by piece each time I need a new instance, I hand the OVF package to my virtualization platform and it deploys the system as specified, exactly the same, every time. This saves me time and equipment, since I perform tests every few weeks that need a basic LAMP stack. When a test is over, I remove the LAMP stack implementation and use the physical resources for other tests. Then I deploy my LAMP stack package again the next time I need it.
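A descriptor for a package like this might look roughly like the following. This is an illustrative skeleton only: the element names come from the OVF envelope schema, but the IDs and file names are placeholders, and the hardware sections a real descriptor requires are omitted.

```xml
<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1"
          xmlns:ovf="http://schemas.dmtf.org/ovf/envelope/1">
  <!-- Disk images shipped in the package (placeholder names) -->
  <References>
    <File ovf:id="web-disk" ovf:href="web-server.vmdk"/>
    <File ovf:id="db-disk" ovf:href="db-server.vmdk"/>
  </References>
  <!-- The two-server LAMP system described above -->
  <VirtualSystemCollection ovf:id="lamp-stack">
    <Info>Two-server LAMP stack for QA testing</Info>
    <VirtualSystem ovf:id="web-server">
      <Info>Linux VM running the Apache HTTP server</Info>
      <!-- VirtualHardwareSection (CPUs, memory, NICs, disks) omitted -->
    </VirtualSystem>
    <VirtualSystem ovf:id="db-server">
      <Info>Linux VM running MySQL</Info>
      <!-- VirtualHardwareSection omitted -->
    </VirtualSystem>
  </VirtualSystemCollection>
</Envelope>
```

The virtualization platform reads a descriptor like this, attaches the referenced disk images, and brings up both virtual machines with the specified configuration every time the package is deployed.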

Of course, I could do all this with a script, but in most similar environments, formats and tools have supplanted scripts. Look at Linux application deployments. I remember writing shell scripts for deploying complex applications, but now most applications use standard tools and formats, like .deb files and installation tools on Debian. In fact, .deb files correspond fairly closely to OVF packages. On Windows, .bat files have been replaced with tools like InstallShield and Windows Installer. Why? Because scripts are tricky to write and hard to understand and extend.

OVF provides similar advantages. With an OVF package, authors don’t have to untangle idiosyncratic script logic to understand and extend a deployment. They don’t have to reinvent the wheel with every new script, and they can exchange packages written in a common language with other authors.

Interoperability between virtualization platforms would be nice, but I still get great benefits without it.

An International Standard

It is also important to realize that OVF has significant visibility as a national and international standard. OVF has been through rigorous reviews by a national standards body (ANSI) and an international standards body (ISO/IEC). You may ask if that is significant. Who cares? After all, don’t hot products depend on cool designs and code, not stuffy standards? Maybe. The next hot product may not depend on OVF, but I’ll guarantee that, if not today, then sometime in the near future, you will interact with a service that is more reliable and performs better because it has an OVF package or two in the background. Hot prototypes may not depend on OVF, but solid production services depend on reliable and consistent components, which is exactly what OVF provides.

Every builder of IT services that run on a cloud or in a virtual environment should know what OVF can do for them and consider using it. They should look at OVF because it is a stable standard, accepted in the international standards community, not just the body that published it (DMTF), and, most of all, because it will make their products better and easier to write and maintain.

The Death of Microsoft?

Is Steve Ballmer leaving?  Is Microsoft about to roll over and die? Are they already buried? Is Windows 8 an abject failure? The Surface a fiasco? Bing a joke? Are PCs and Windows obsolete?

I have no idea about the politics, pressures, or whims that made Ballmer leave. I am curious about his successor, but I guess we’ll all find out soon enough.

Death of the PC

The death of the PC is an exaggeration. Tablets and phones are popular and replace the PC for many people, but there still are content producers who want a fat keyboard and a couple of big displays. The PC market will undoubtedly continue to shrink, but it won’t disappear. One reason for the shrinkage is seldom mentioned: PC hardware is ahead of software. Except for gamers, last year’s laptop doesn’t cry out for replacement anymore. Have you noticed that W8 generally performs better than W7 on the same hardware? A new Windows release used to be Intel’s best sales rep. This round you get improved performance without a new box.

Microsoft Software

Microsoft just might have the right idea with W8, combining a touchscreen interface with the traditional Windows desktop. After getting used to it, W8 is not so bad; I find it easy to overlook the clunky tiles-to-desktop transition, and I don’t miss the run menu. Microsoft has a long history of polishing up rough versions. Does anyone but me remember Word for DOS? Improvements in software interfaces are always painful. It is close to impossible to get everything right on the first try, and true improvements look like bugs until you get used to them.

Although Microsoft has had some product failures in the last few years, they have also had some real successes. Office has improved enormously. Word documents with complex formats don’t do strange things nearly as often as they used to. I almost never edit with the format marks turned on, something I used to do by default to untangle confused formatting. Outlook has largely quit screwing up my appointments. OneNote is a product I initially passed on because I thought it was a toy, but since I started using it, I have had it open all the time. I believe it is the best new product I have ever seen from Microsoft. These are improvements I respect.

Redmond Culture

Take this for what it is worth; your mileage may vary. I’ve lived in or close to Redmond for many years, but I have never worked for Microsoft, though I have met with Microsoft engineers many times. I think something good has happened in the Microsoft culture. I used to avoid Microsoft people, a hard thing to do in Redmond, because they were just too full of Microsoft, but in the last decade that has changed. Their culture has lost its arrogant edge, and their professionalism has gone up. I attribute the solid usability of the latest rounds of Microsoft software to that change.

Microsoft is in a tough spot. The PC and Windows market is changing fast, and they will have to work hard to maintain their place. But I also think that, with the right leadership, they will be better prepared to face the headwinds than they ever have been before.