Passwords and Accounts

I’m beginning to be overwhelmed.

A few weeks ago, I lost a USB key (flash drive) with a copy of my master Firefox profile on it. The master profile has all the passwords in it. Think about that for a minute. ALL THE PASSWORDS. In one place.

Ouch.

After a rather frantic day changing the passwords on 227 different accounts, and struggling with a new password regimen, it became clear: I need a way to manage passwords. I probably also need better passwords, or at least more of them.

In the process, I also found out which sites have rather poor password policies, and I’ve made a list of places to re-assess. In this day and age, password policies of “all numeric” or “only eight characters” or “upper and lower case only, no numbers” are absurd. I’ve already decided to change vendors in some instances, due to absurd password policies.
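
For the curious, here’s a rough back-of-the-envelope sketch, in Python, of why those policies are absurd: the strength of a randomly chosen password grows with both the size of the character set and the length, and the restricted policies throw most of that away. (The policy sizes below are illustrative assumptions, not any particular vendor’s rules.)

```python
import math
import secrets
import string

def policy_entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a password drawn uniformly from alphabet_size ** length choices."""
    return length * math.log2(alphabet_size)

# A few of the policies in question, versus a saner one:
print(policy_entropy_bits(10, 8))    # "all numeric", 8 digits: ~26.6 bits
print(policy_entropy_bits(52, 8))    # "upper-lower case only", 8 chars: ~45.6 bits
print(policy_entropy_bits(94, 16))   # full printable ASCII, 16 chars: ~104.9 bits

# And generating a password under the saner policy:
alphabet = string.ascii_letters + string.digits + string.punctuation
print("".join(secrets.choice(alphabet) for _ in range(16)))
```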

I still have to figure out how to manage the passwords. There are several commercial solutions, as well as some open-source ones, but almost all of them suffer from one or more drawbacks. I guess I’ll end up making a compromise somewhere.

The first problem is with the hardware solutions: you have to carry the device around with you, it needs batteries, it only stores a small set of passwords, and what if I lose it? I don’t think I’m going to use a dedicated hardware unit.

The software solutions, well, I think I’ll have to go with one of them, but as an alternate path I’m also beginning to use OpenID. I have accounts with several of the providers, but after having poked around a bit, I think I’ll end up using the Google-based provider most often. For this to work, of course, you have to have a Google Profile – and thus a new webpage was birthed.

Along the way I’m also going to finally take the plunge into the smartphone pool – StupidPhone™ is starting to wear out, and it’s about time I stepped forward from the trailing edge of technology. Whichever password manager I pick needs to run on an Android-based phone.


Growing tired of Facebook…

I think I’m about to reach the end of the line on Facebook… not totally (I’ll keep a few people on the list), but I’m realizing it is:

1) a colossal waste of time; 2) riddled with bugs and viruses; and 3) not a particularly viable medium for discourse.

This will get updated some over the next few days, but I’m about to trim the Facebook “friends” list from its current 145 to perhaps a third of that number. Among other things, Facebook is reminding me why I haven’t bothered to return to Burlington, NC in well over 20 years.

Would the Internet exist without US Government sponsorship?

Yet another post based on the muse of Facebook…

I gotta ask, how subsidized is the “internet”? Would this thing be able to operate in a free market, in your opinion (I would assume yes, as it has massive profits available to it)? But could the start-up of the internet have been possible without subsidization? Not sure how clear my question is. (a Facebook Friend)

My response:

As it is right now, there are no subsidies involved… it’s self-supporting, based on domain registration fees and the general goodwill of the various commercial suppliers involved. To the extent the US Govt is involved at present, it is as a major consumer of bandwidth and as a content supplier.

Starting out… The Internet (TCP/IP) protocol suite displaced X.25, which was available commercially from the late 1970s (I had an account from May 1978 onwards via Tymnet). X.25 is based on virtual circuits and is closer in conception to telephone switching than to the current Internet.

In X.25 networks, you connected to a single destination and relied on that destination to provide your content and services. This was the original function of services such as CompuServe, Delphi, Prodigy and AOL. By 1993 the X.25-based services were handling around 20 million subscribers, compared to perhaps 500,000 users on TCP/IP. It’s for this reason Windows 95 did not handle TCP/IP very gracefully; there was a good business argument to be made against the whole Internet “fad.”

In ’93 or ’94 the US Govt started to transition out of running the “Internet” and opened it up to commercial users. Since TCP/IP ran on damn near anything (X.25, by comparison, required special switches and lots of infrastructure) and had no messy royalties and such, it began to catch on quite quickly.

On bringing light to the darkness…

Yet another post inspired by Facebook discussion. I think I’m beginning to find my muse…

The web is not quite 20 years old (Dec ’91 was when the concept was published). While for most people the Internet revolves around Internet Exploder or Firefox or Safari, there have been other products.

In the beginning, there was Mosaic. And it brought light from the darkness, but it was short on features. Marc A led a small band out of it to beget Netscape, which had features, but crashed a lot, and eventually was bought by AOL, who set about to kill it. A later remnant escaped through the wilderness to start Phoenix, but they ran afoul of trademarks and thus, by way of Firebird, begat Firefox.

Somewhere along this path, Bill Gate-us of Borg beheld Mosaic and begat Internet Explorer, which, being of foul parentage, became the source of much pain and suffering, and in derision it is named Exploder.

Thus ends the quick genealogy lesson.


…and if the foregoing largely makes no sense, then here is the deal:

You access the Web by way of specialized software, the “Browser.” It allows you to browse content, in much the same way people [used to?] browse the shelves at the library, looking for something interesting to pull down and read.

NCSA Mosaic, Netscape, Firefox, IE, Safari, etc. are all examples of the “graphical browser.” This is the interface almost everyone uses, and for many people, this is “the Internet.”

It’s only a part of the Internet. There are also text-based browsers; Lynx is the most prevalent of these. Why would anyone use a non-graphical browser? Suppose you’re blind, but still want to make use of the Internet. You don’t need to download the pictures (you can’t see them anyway). Or suppose you have limited bandwidth, but need to get some information. One regular reader of this blog always makes disparaging remarks about the US National Weather Service relying on UPPERCASE TEXT FOR ALL WX MESSAGES, but there are both international treaties and good solid engineering reasons for the all-caps text. [Technical rationale: all-caps text can be transmitted as 6-bit code rather than 8-bit, saving 25% of the bandwidth; and while there is a single source for most WX data, there are multiple output streams, at least some of which still use Baudot encoding.]
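
To make the 25% figure concrete, here’s a rough illustration in Python. The character set below is an assumption for the example, not the actual weather-circuit code table; the point is that any repertoire of 64 or fewer symbols fits in 6 bits per character instead of 8, so the same message needs only three-quarters of the bits.

```python
# Rough illustration of the 6-bit savings; this alphabet is an assumption
# for the example, not the actual weather-circuit code table.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,-/()"  # 43 symbols, fits in 6 bits

def bits_needed(message: str, bits_per_char: int) -> int:
    return len(message) * bits_per_char

msg = "SEVERE THUNDERSTORM WARNING FOR THE FOLLOWING COUNTIES"
eight_bit = bits_needed(msg, 8)
six_bit = bits_needed(msg, 6)
print(eight_bit, six_bit, 1 - six_bit / eight_bit)  # the savings works out to 0.25
```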

Back to the story… Browsers convert the user’s simple “woodallrvcc.wordpress.com” to the several lines of commands necessary for the webserver (the other end of the conversation) to find the content the user desires; then the browser interprets the content received and displays it…
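
For those who like to peek under the hood, here’s a rough sketch in Python of the browser’s side of that conversation, simplified to plain HTTP on port 80 for the example (real browsers send many more headers, and these days usually negotiate HTTPS first):

```python
# A bare-bones imitation of what a browser sends; real browsers add many
# more headers (cookies, compression, user-agent details, and so on).
import socket

host = "woodallrvcc.wordpress.com"
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    reply = b""
    while chunk := sock.recv(4096):
        reply += chunk

# The status line and first headers of the server's answer:
print(reply[:200].decode("ascii", errors="replace"))
```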

It’s important to remember the browser does not represent the entirety of the Internet, and also that browsers are not one-size-fits-all – except, for the moment, on so-called “smart phones.” So if you’re looking for that extra edge, try another browser.

Thunderbolt and Light Peak

On February 24, 2011, Apple (with an assist from Intel) attempted to change the world. Again.

Fizzy fizzy (fizzle?) – to be expected when you make lemonade from a lemon.

That’s my quick take on Thunderbolt – it’s an attempt to make lemonade from a lemon.

Here’s the picture: Apple fell behind on peripheral connections. This is the attempt to leapfrog over everyone’s head and come out with something all shiny and new. Apple was first to use Firewire, first to have USB-only notebooks, and then they stagnated. They made a couple of updates, adopting USB 2.0 and changing to Firewire 800, but they ignored eSATA and USB 3. Their notebooks have always been somewhat crippled by a lack of external ports (I love seeing the big bags of holding carried by serious Apple users, containing the USB hub and cables and external drives and other ephemera considered ‘necessary’).

The world marches on, and Apple belatedly realized they needed a new external peripheral bus. Thus Thunderbolt.

Except what they picked isn’t all that shiny, or new, and might even be regarded as a bit of a flop. Light Peak is Intel’s next-generation peripheral bus; based on optical fiber, it promised multi-gigabit throughput and tons of interconnectivity. Thunderbolt has the 10 Gbit/s throughput… but on copper wire. You have to believe something went a bit wrong between the lab and the showroom.

There’s some buzz generated by the incorporation of DisplayPort technology into Thunderbolt. How this plays out is still up in the air, but it does bring one thing to my mind: DRM. That’s right, along with Intel’s next-generation processors, which include DRM on-chip, now the peripheral bus will also have Rights Management. Videographers might want to re-read the fine print on the H.264 licensing agreements… and contemplate. Of interest also is the sole-sourcing of Thunderbolt controllers (Intel) and the de-facto imposition of a royalty on implementation. It’s this last that effectively destroyed Firewire in the marketplace. Apple can be a slow learner at times.

Thunderbolt/Light Peak allows for seven devices, daisy-chained (one after another), with DisplayPort at the end of the chain. Theoretically there is 10W of power for peripheral devices – watch those batteries drain! So there is more peripheral power available, but far fewer devices can be attached. Right now that’s not a problem, as LaCie is the only company with Thunderbolt product in the channel, and only a handful of peripheral manufacturers have so far climbed onto the bandwagon.

One other thing I see from reading the specifications: Thunderbolt allows direct memory access to main system memory (the bus operates peer-to-peer, just like Firewire) and thus may well have the security hazards of Firewire as well. Do you really trust that projector you just attached?

Support your local computer store…

These are the “hole-in-the-wall” operations, working out of a little storefront in the old downtown, where the rents are a bit cheaper. These are the outfits with old recycled display cases, a few bedraggled items in the display window, the piles of stuff all over the interior.

In short, these are the experts. They’ve seen more different problems in a week than most corporate help desk techs will see in a career. They’re creative, intense, eccentric and generally dedicated to fixing the problem.

Best Buy, Staples, Office Max, the other big chains… well, they run a “repair” service. The usual aim is to sell you stuff; the repair shop at a big-box store is there to get you in the door, or to sell you a “service.” The local shop knows the dirty secrets: 1) It’s usually software that is the problem; and 2) Fixing it will take several hours of time – or maybe even a couple of days.

Hardware has become much more reliable – so much so that the only repair stock the big stores carry is upgrade parts. The local guy is probably building to order, so he may well have all the parts. The local store is also much less likely to make things worse – such as turning a simple CD-ROM drive swap-out into a replace-the-motherboard debacle (after seeing this trick three times in a year, it’s apparently a ‘feature’ at one chain!).

The local shop will typically have a back room full of racks of desktop machines, all working their way through various parts of the repair-tuneup cycle (one local shop has a 32-point checklist), all tied in to a couple of overworked keyboard/video/mouse switches and a tired server or three for drivers and system reloads. Any wall space has long since been papered over with checklists and procedures, and the only clear spot on a desk is reserved for the coffee cup.

A local shop will also be happy to build a computer to your specifications – not to what a manufacturer or chain-store buyer thinks sounds good. I’m typing this entry on a custom-made system: it has a nice quad-core processor, lots of RAM, prodigious disk space, Firewire, RAID, a big tower case with lots of cooling… and a basic video card. It’s just what I wanted.

Be a computer locavore.

A short discourse on Anti-Virus products…

A lot of the blog posts start as discussions on Facebook and then grow into something I think may have general audience appeal. This post started that way, as a discussion on cleaning a system after infection, and now has evolved into a discussion/review of anti-virus products.

My baseline security is Windows Firewall combined with MSE, plus the NoScript and AdBlocker filters in Firefox and just AdBlocker in Chrome. I only use IE for the RVCC site and some Microsoft corporate contacts. Once you get a basic install up and running you should image it, and use that image as the baseline for future restores. I run full-system images on a monthly basis, with snapshots of some of the mission-critical stuff (some directories are on every-4-hour replication!).
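
For what it’s worth, the every-4-hour replication is nothing exotic. Here’s a minimal sketch of the idea in Python; the paths are placeholders, and in practice you’d let Task Scheduler or cron do the timing instead of an endless loop:

```python
# Minimal snapshot-replication sketch: copy a critical directory into a
# timestamped folder every few hours. Paths are placeholders; schedule the
# real thing with Task Scheduler/cron instead of an endless loop.
import shutil
import time
from datetime import datetime
from pathlib import Path

SOURCE = Path(r"C:\Users\me\CriticalStuff")      # hypothetical source directory
SNAPSHOTS = Path(r"D:\Snapshots\CriticalStuff")  # hypothetical snapshot target
INTERVAL_HOURS = 4

while True:
    stamp = datetime.now().strftime("%Y%m%d-%H%M")
    dest = SNAPSHOTS / stamp
    shutil.copytree(SOURCE, dest)  # each snapshot is a full, independent copy
    print(f"snapshot written to {dest}")
    time.sleep(INTERVAL_HOURS * 3600)
```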

Free AV = generally loss-leader advertising for the full product; MSE is the only real-time freeware AV I know of that does not function as beggarware. Clam-AV doesn’t make the grade, as it is a batch scanner only.

MSE = Microsoft Security Essentials = free (as in beer) basic antivirus protection; low impact on host system; does its job quietly without a lot of fuss.

Clam-AV = batch-mode anti-virus scanner; no real-time component; thus it is useful as a recovery tool or on an email or webserver to test uploaded items.

The rest of these comments reflect my experience with the paid versions of AV products. I have or have had paid subscriptions with all of the following:

AVG = big, bloated, really trying to outdo Norton and McAfee for useless but cute features, and growing more expensive all the time.

Avast = cute but often behind the times on virus definitions; starts begging for renewal at the 40% mark; gets hyper over normal traffic when using the firewall product.

Norton = McAfee = Trend Micro = bloatware. The fact all of these require a special “removal tool” should suffice as a warning not to get these products.

Kaspersky = when it works, it works well. When it works. And therein lies the problem and the reason I can’t recommend it.

Zone Alarm = when it first came out, this was a decent product. But the company had no real business plan and eventually sold out to Check Point, who turned it into the hyperactive scare-ware typical of most consumer firewalls. If properly configured it works well, at least until it is broken by the next update.