on public key encryption and security

As part of the moving process I got a bank account, and I was reminded, again, of how comically flawed the security systems of most online banks are, which led me to even greater anger about security in general. The following rant is what happened.

I should say at the outset that I’m not really a security expert; I just dabble in this stuff. Having said that…

“Security” online and in a digital context covers two pretty distinct aspects:

  • Identity. In real life we can show a driver’s license or passport, we can say “I’m [insert name here],” and in many situations another person is probably not too far away who can say, “I know them, they’re [insert name here].” Online? Identity is less easily and reliably verified. Identity matters both for individuals’ (and organizations’) identities and for the things that people (and organizations) produce or own: emails, documents, web pages, software, and so forth.
  • Encryption. Basically, we encrypt data so that we can be relatively certain that no one gains access to it by listening in on our network connection or by gaining access to physical media. From encryption we get privacy, and as long as the encryption scheme works as it should and covers communications end-to-end, it’s pretty safe to assume some measure of privacy.

It turns out that, from a technical perspective, encryption is reasonably easy to achieve. It’s true that all cryptographic schemes are ultimately breakable; however, if we can generally assume best practices (expiring old keys, keeping your private keys safe, etc.), then I feel fairly safe in asserting that encryption isn’t the weak part of the security equation.
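
To make that concrete, here’s what the basic operation looks like with GnuPG; a minimal sketch, and the recipient address and file name are hypothetical:

    # encrypt a file so only the holder of the recipient's private key can read it
    gpg --encrypt --recipient alice@example.com notes.txt   # writes notes.txt.gpg

    # on the other end, decrypt it (prompts for the private key's passphrase)
    gpg --decrypt notes.txt.gpg > notes.txt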

This leaves identity on the table. Which is sort of a messy affair.

Just because someone says, “Hello, my name is Alice,” it doesn’t mean that they are Alice. Just because they have Alice’s password doesn’t necessarily mean that they are Alice either (but that’s a safer bet). The best and most reliable way to verify someone’s identity, it turns out, is to have a “web of trust.”

Which basically means: you assert that you are who you say you are, and then “vouch” for other people who you “know” are who they say they are. Once you’ve vouched for someone, you then “trust” the people they’ve vouched for, and so forth. Good web-of-trust systems allow you to revoke trust, and provide some mechanism for propagating trusted networks of identities among users.
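
In GnuPG terms, vouching is signing someone’s key. A rough sketch of the workflow (the key ID here is hypothetical):

    # fetch a key, sign it after verifying its fingerprint in person,
    # and publish the signature so others can build on your vouch
    gpg --recv-keys 0xDEADBEEF
    gpg --sign-key 0xDEADBEEF
    gpg --send-keys 0xDEADBEEF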

The system described above is a very peer-to-peer/ad hoc system (bottom up, if you will). There are also more centralized (top down) systems which can function to verify identity in a digital context. These systems depend on commonly trusted third parties that are tasked with researching and verifying the identity of individuals and organizations. So-called “certificate authorities” make it possible to trust identities without needing a personal web-of-trust network to extend to cover every person and organization you come in contact with.
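
You can actually watch this third-party vouching happen. With OpenSSL (the hostname here is just an example), the following prints the chain of certificates a server presents, from its own certificate up to the certificate authority that vouched for it:

    openssl s_client -connect example.com:443 -showcerts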


Let’s bring this back to the case study of the bank.

They encrypt their traffic, end to end, with SSL (i.e. TLS), and they pay for a certificate from a certificate authority with a good reputation. The weak part of this equation? You and me, apparently.

To verify our identity, we have this arcane and convoluted scheme whereby we have to enter hard-to-remember passwords in stages (my last bank had us enter passwords on three pages in succession) so that the bank can be sure we’re who we say we are. And the sad part is that while the technology for doing encryption and identity verification in secure and reliable ways is pretty advanced (in the big picture), we still have to send passwords. Here are my thoughts on passwords:

  • The best passwords are the hardest to remember. The best passwords don’t contain words; they mix numbers, letters, and punctuation. But these passwords are difficult to remember, and I think many people avoid picking truly secure passwords because of the difficulty.

  • Passwords aren’t bad, and I suspect they’re most useful as a casual deterrent and a reminder to users of the potential gravity of the situation; but they’re not exactly a reliable fingerprinting mechanism.

  • Some sort of cryptographic handshake would be many magnitudes more secure, and much less painful for users (there’s a sketch of what I mean after this list).

    I have this theory, that security for banks (and other similar institutions) is more about giving the appearance of being secure (asking for more complex passwords, making you jump through more hoops, etc.) and less about doing things that would be more secure in the long run. But maybe that’s just me.
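
Here’s the sort of handshake I have in mind, sketched with OpenSSL; the file names are hypothetical, and a real protocol would need a fresh random challenge for every login:

    # the bank sends a random challenge; I sign it with my private key
    openssl rand -out challenge.bin 32
    openssl dgst -sha256 -sign my_private_key.pem -out challenge.sig challenge.bin

    # the bank verifies the signature against the public key I registered
    # when I opened the account; no shared secret ever crosses the wire
    openssl dgst -sha256 -verify my_public_key.pem -signature challenge.sig challenge.bin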

Anyway, back onto more general interest topics in the near future.

on package management

I was writing my post on distribution habits and change, and I realized that some elaboration on the concept of package management was probably in order. This is that elaboration.

Most Linux--and indeed UNIX, at this point--systems have some kind of package management:

Rather than provide an operating system as one monolithic and unchanging set of files, distributions with package management provide some sort of database and a common binary package format that allows users to install (and uninstall) all software in a clear, standardized, common manner. All software in a Linux system (generally) is thus covered by these package managers, which also do things like tracking the way that some packages depend on other packages, and making sure that the latest versions of a package are installed.
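
Concretely, the day-to-day interface looks something like this (Debian’s apt shown; the package name is just an example):

    apt-get install mutt     # fetch mutt, plus whatever it depends on
    apt-get upgrade          # bring every installed package up to date
    apt-get remove mutt      # uninstall, with every file accounted for
    dpkg -l                  # list everything the package manager knows about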

The issue is that there are lots of different ways to address the above “problem space,” and a lot of different goals that operating system designers have when designing package management and selecting packages. For instance: how do we integrate programs into the rest of our system? Should we err on the side of the cutting edge, or on the side of stability? Do we edit software to tailor it to our system and users, or provide more faithful copies of “upstream sources”? These are all questions that operating system/distribution/package maintainers must address in some way, and figuring out how a given Linux distribution deals with them is, I think, key to figuring out which system is the best for you, though to be fair, it’s an incredibly hard set of questions to answer.

The thing about package management is that whatever ideology you choose with regard to which tools you use, which packages to include, and how to maintain packages, the following should hold: all software is managed by the package management tools, without exception. Otherwise, it becomes frighteningly easy for new versions of software to “break” old, non-managed versions of a piece of software: overlapping file names get overwritten or deleted, one version of a program loads when you mean to load another, it becomes nearly impossible to remove all remnants of an old piece of software, and it gets hard to know when a piece of software needs to be updated to a new version for security fixes or some such.
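
This is also why the “who owns this file?” query is so useful: if the answer comes back empty, you’ve found software living outside the package manager’s view. The file path below is just an example:

    pacman -Qo /usr/bin/mutt    # which package installed this file? (Arch)
    dpkg -S /usr/bin/mutt       # the Debian equivalent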

I hope that helps.

why arch linux rocks

So, long story short, I’ve been working a lot with ArchLinux in the last few days, getting it set up, and starting to use this peculiar little distribution. While I will surely be blogging more about Arch in the coming days, I think a brief list of first impressions is in order.

  1. I share values with the Arch Developers.

    This is, I think, a factor in “choosing a Linux (or BSD) distribution” that is really hard to understand or explain, in part because the values that distinguish distributions are sometimes hard to suss out, particularly if you’re on the outside looking in. This explains the phenomenon of “distro hopping.”

    My sense of the “Arch” philosophy/approach is largely what this post is about, but in summary: Arch is lightweight and minimal; Arch expects users to be sophisticated and intelligent (Arch would rather tell you how something works, so you can do it “right,” than try to save you from yourself and do it in a way that might be wrong); Arch is a community project that isn’t reliant on commercial interests; and Arch is firmly dedicated to free software ideals.

    How does this compare to other distributions you’ve heard of? Arch is community oriented/originated like Slackware and Debian; Arch is lightweight like debian-netinst and Gentoo; Arch is minimal like Damn Small Linux (though not quite that minimal) and the other tiny Linuxes; Arch is based on binary packages like Debian and Fedora/RedHat/CentOS; Arch uses the Linux kernel, but takes inspiration from the BSDs in terms of system architecture; and Arch uses a rolling release cycle like Debian’s testing branch and Gentoo.

  2. It doesn’t dumb anything down, and doesn’t expect users to be either experts *or* total beginners.

    I think the term they use is “intermediate” or “advanced beginner,” but in any case I think the approach is good: provide configuration in its most basic and straightforward form, document it, and trust that a straightforward configuration setup will be easier to manage in the long run than a more convoluted, but “easy,” setup.

    Basically, Arch assumes that complexity and difficulty go hand in hand, and that simplicity and ease of use are connected in the same way.

  3. Arch values and promotes minimalism.

    This comes from a few different aspects of Arch, but in general the avoidance of complexity in the configuration and the “blank slate” aspect of the installation process combine to create a system that is minimal and almost entirely agnostic with regard to what you might want to do with it.

    Whereas many Linux-based systems are designed for specific tasks (e.g. Mythbuntu, Medibuntu, Linux Mint, Crunch Linux, etc.) and include software by default that supports this goal, Arch installs no (or very little) software by default, and can function well for a wide range of potential uses, from the fully featured desktop to the minimalistic headless server install.

  4. The Arch Wiki Rocks.

    I’ve been thinking about wikis and what makes a wiki “work” rather than “not work,” and I’m beginning to think that the ArchLinux Wiki is another example of a wiki that works.

    I used to think that wikis powered by the MediaWiki engine were always bad: they look too much like Wikipedia (and are reasonably hard to customize), and as a result people tend to treat them like Wikipedia, which carries all sorts of baggage from the tradition of 19th-century encyclopedic projects and colonialism, and fails to capture some of the brilliance and effectiveness of wikis outside the world of Wikipedia (and, by association, the MediaWiki engine).

    Despite this, the ArchLinux wiki is actually really good, and provides helpful instructions for nearly everything to do with Arch. It looks good, and the more I read it, the more I find that the cool discursive/opinion-based modality I enjoy most about wikis is present on the Arch Wiki.

  5. Archies are really geeky and great, and their interests and tendencies are reflected in the packages provided by the system:

Allow me to justify this with a few anecdotes:

  • Arch includes a “snapshot package” of emacs-23 in the main repository (you have to add another repository to get this in Debian).
  • There is a great crossover between Awesome--my window manager of choice--and Arch, so there are good, up-to-date packages of Awesome.
  • Uzbl (i.e. “useable”), a super-minimalistic, WebKit-based browser, is developed on/for Arch.
  • As I was getting my first virtual machine set up, I did a bit of distro hopping to see what would work best. I decided to use VirtualBox (because it’s nearly free software, and reasonably full featured), and I had a hell of a time getting other OSes to work right inside of it, but it appears that other Archies have had the same thought: there were pretty good explanations on the wiki, and it just worked.

How cool is that? I don’t think Arch is for everyone, but if any of what I’ve talked about today sounds interesting or appealing, give it a shot. Also, my experiences with running it under VirtualBox have been generally favorable, so if that’s more your speed, give it a shot that way.

Onward and Upward!

distribution habits and change

Here’s another one for the “new workstation series.”

Until now my Linux usage has been very Debian-based. It’s good, it’s stable, and the package management system is really intensely wonderful. I was happy. And then I was somewhat less happy with Ubuntu, the distribution that I’d been using on my desktop.

Don’t get me wrong, Ubuntu is great and I’d gladly recommend it to other people, but… with time, I’ve found that the system feels clunky. This is hard to describe, but a growing portion of the software I run isn’t included in the normal Ubuntu repositories, some services are hard to turn off or manage correctly (the display manager, various superfluous GNOME parts), and I’ve had some Ubuntu-related kernel instability.

My server, of course, runs Debian Lenny without incident. There’s something really beautiful about that whole stability thing that Debian does. Because of the wonderful experience I had with Lenny, I considered running Debian (lenny/testing) on my desktops, and I tried various flavors of Debian, but I found that for the software I wanted to run, things were either too stale or too unstable. This is totally to be expected, as Debian’s singular goal is stability, and getting a fresher, less stable operating system that’s based on Debian is always going to be a difficult proposition.

In the past I’ve dragged my feet on upgrading operating systems, because I take a “don’t fix it if it ain’t broke” approach to maintaining computers, and all my systems worked. So until I was faced with this work computer--despite my dissatisfaction--I’d never really seriously considered the mechanics of changing distributions, much less the prospect of having to interact with a Linux distribution without the pleasures and joys of apt-get.

But then I got this new work station and…

I switched. At least for one system. So far. ArchLinux uses a system called “pacman” for package management. Pacman is really nifty, but it’s different from apt: its output is a bit clearer, it’s just as fast and “smart,” and the packages are fresh.

And then there’s the whole business of the way Arch approaches management; Arch uses a “rolling release” system where, rather than releasing a version of an operating system that has a given set of packages at a given moment, Arch releases packages when they’re “ready,” on a package-by-package basis (with an awareness of interactions between packages), and pacman has a companion system for building and installing software that isn’t in the main repositories as proper packages (which makes upgrading and removing said packages later much easier).
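
In practice, the rolling model boils down to a small set of habitual commands; the package name here is just an example:

    pacman -Syu       # sync the package databases and upgrade everything
    pacman -Ss mail   # search the repositories
    pacman -S mutt    # install a package and its dependencies
    pacman -R mutt    # remove it again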

This sounds complex, maybe too complex, but somehow it’s not. When I started thinking about writing this post, I thought, “how do I convey how totally strange and different this is from the Debian way?” By the time I got around to actually writing this post, I’d settled into a place of stability, and I must confess that I don’t notice it very much. It’s wild, and it just works.

I was going to go on this whole spiel about how, even though the functional differences between one “flavor” of GNU/Linux and another are pretty minimal, it’s interesting to see how different the systems can “feel” in practice. Here’s a brief list of what I’ve noticed:

  • I find the flags that control operations with pacman non-intuitive, and frankly I’m a bit annoyed that they’re case sensitive, so that pacman -S is different from pacman -s, which leads to a lot of typos.

  • I’ve yet to set up a machine in Arch that uses wireless. I’m just wary, mostly, of having to set up the network stuff “by hand” in Arch, given how finicky these things can be in general.

  • The ABS (Arch Build System, for installing packages that aren’t in the main Arch repositories) took some getting used to; I think this is more about learning how to use a new program/tool, and the fact that the commands are a bit weird. (There’s a sketch of the workflow after this list.)

    Having said that, I really like the way the package building scripts just work and pull from upstream sources, and even, say, use git to download the source.

  • I’m impressed with how complete the system is. Debian advertises a huge number of packages and prides itself on its completeness (and it is complete, I’m not quibbling), but there I run into programs where I have a hard time getting the right version, or the software plain old isn’t in the repository. With Arch, I’ve yet to find something that isn’t in the repository. (Well, getting a version of mutt with the sidebar patch was a bit rough, and I haven’t installed the urlview package yet, but that’s minor.)

    I think this is roughly analogous to the discussions that python/ruby people have with perl people about the actual vs. the advertised worth of the CPAN (e.g. CPAN is great and has a lot of stuff, but it suffers from being unedited, and its huge number of modules is more an artifact of time than of actual value). So while Debian (and CPAN) have more “stuff” than their competitors, in many cases the competitors can still succeed with less “stuff,” because their “stuff” is more edited: they can choose the 80% of stuff that satisfies 94% of need, rather than having 90% of the stuff that satisfies 98% of need. Diminishing returns and all that.

  • It’s lightweight and smooth. While the hardware of my work computer is indeed impressive, particularly by my own standards, the hardware of the “virtual machine” isn’t particularly impressive. And it’s still incredibly peppy. Lightweight for the win.
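
Since I mentioned ABS above, here’s roughly what building an out-of-repository package looks like; the package name is hypothetical, and the details of fetching the build script vary:

    # after downloading the build files for a hypothetical package 'foo'
    tar xzf foo.tar.gz && cd foo
    makepkg -s                    # build it, pulling build dependencies via pacman
    pacman -U foo-*.pkg.tar.gz    # install the result as a real, tracked package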

Next up? More reasons Arch Rocks.

See you later!

Multiple Computers and Singular Systems

Here’s another episode in my “work workstation” series of posts about setting up my new computer for work, and related thoughts on evolving computer setups. First, some history:

My tendency and leading desire is to make my working environment as consistent as possible. For a long time I was a one-laptop kind of guy. I had a PowerBook, it did everything just the way I wanted it to, and whenever I needed to do something digitally, I had my computer with me, and I didn’t have to worry about other people’s computers being configured wrong. It meant that I worked better/smarter/more effectively, and I was happy.

When the PowerBook died, particularly as “my work” became intertwined with my computer, it became clear that I needed a bit more: a computer I could stretch out on, both in terms of things like media (music/video/etc) and in terms of screen space. Concurrently, I also discovered/became addicted to the Awesome window manager, and this has been a great thing for how I use computers, but the end result of this transition was that I had to manage (and needed to use) a couple of machines on a fairly regular basis.

Basically, I have a set of applications and tools that all of my systems have installed; either their configurations are all standard, or I store a copy of the configuration file in a git repository that I link all of the machines to. My work is all stored in git repositories that I sync between machines as needed. It works pretty well, and it means that, aside from hardware constraints, it’s not so much that I have multiple machines as it is that I have different instances of the same machine.
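
A minimal sketch of that syncing pattern; the repository location and file names are hypothetical:

    # keep dotfiles in a git repository and symlink them into place
    git clone me@myserver:git/dotfiles.git ~/dotfiles
    ln -s ~/dotfiles/screenrc ~/.screenrc
    ln -s ~/dotfiles/muttrc ~/.muttrc

    # when a configuration changes on any machine:
    cd ~/dotfiles && git commit -a -m "tweak screenrc" && git push
    # ...and on all the others:
    cd ~/dotfiles && git pull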

Next: the implications…

I think, above all, I’m a UNIX guy. UNIX is a system built around a certain kind of modularity, and I’ve worked out practices for myself that take advantage of it to keep my application configurations synced between machines. Most of the time configurations don’t change, but sometimes they do, and when that happens all I have to do is sync up a git repository.

The second implication is that I set up and work with my systems with some notion of stability. While I must confess that I’m not entirely pleased with the way Ubuntu has my desktop and laptop running, it is stable and reliable, and I’m wary of changing things around for a setup that would be functionally more or less the same, but a bit more parsimonious on the back end. I may be a huge geek and a hacker type, but I’m a writer and reader first, and although while I’m blathering on about my setup it might seem like all I do is tweak my systems, the writing and reading are really more “my thing.”

My Workstation Choices

I’ve been talking in fairly abstract terms about this new workstation that I’ve been setting up, and about how this fits into the general ethos of my existing hardware setup, but I think it’s probably the right time to interject and clarify some of the choices and base assumptions that I’ve made during this process.

  • My current systems (hardware):
    • A moderately powered dual-monitor workstation (Ubuntu-from-Dell, circa October 2008) that’s running Ubuntu for the moment (it will probably move to Arch or Debian in the next few months). This was my work computer during the freelance period.
    • A thinkpad x41t (vintage 2005): 1.4 GHz Pentium M (?), 1.5 GB of RAM, 60 GB hard drive, running Ubuntu with the Lenny kernel. This is my main personal computer at the moment, as I haven’t gotten the desktop set up yet. It’s a great machine, but I do feel a bit cramped on it for heavy day-to-day usage; it’s great for distraction-free writing and portability.
    • (The work computer) A contemporary-vintage iMac running the latest OS X 10.5, and also running Arch Linux in Sun’s VirtualBox.
    • (The infrastructure) Debian based virtual server(s), to provide my own personal cloud (web hosting, git hosting, file syncing, remote shell access, email).
  • My current systems (software, application-centered):
    • Window Management: awesome. I run slim as a display manager on the laptop, and just use startx/xinit on the desktop/VirtualBox sessions (see the sketch after this list).
    • Email: I use mutt for reading email, compose emails in emacs, sort email using procmail, download email using fetchmail (if necessary), but mostly keep mail synchronized using my own git-mail scripts. For sending email and SMTP connectivity I use msmtp, and I suppose I’m using postfix on the server as well.
    • Text Editing: I use emacs23 (still the CVS/development/snapshot branch of emacs; stable is v22). I use 23 because I like the emacs-daemon functionality, and it’s pretty damn stable. I have Aquamacs installed under OS X for the moment, but it’s quirky, so I’ll probably install 23 there soon.
    • Personal Organization: org-mode, which is technically included in emacs (and I use whatever the stock version in 23 is, these days.) I use org-mode for managing my todo lists, shopping lists, project planning and appointments.
    • Shell/Terminal: bash and urxvt(cd) under linux, and terminal.app on Leopard. And GNU Screen. I live in screen.
    • Web Browsing: I use Firefox with Hit-a-Hint and emacs key bindings (Firemacs) on Linux systems, as I wait for the day when good, functional WebKit-based browsers begin to become a possibility.
    • IM/IRC/Chat: mcabber for IM (running ejabberd on my server with the pyaimt transport), and irssi for IRC.
    • Linux Distribution: Debian stable on servers; Ubuntu mostly on desktops, with a desire to move to ArchLinux for desktop use. I love Debian, but for my desktop-use purposes I can’t find a setup that I’m comfortable with, and while Ubuntu is great (and I’m glad it works so well with my laptop), it’s a bit heavy and makes assumptions that I’m not comfortable with. Alas.
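
Since I mentioned startx/xinit above, the relevant configuration is pleasantly tiny. A minimal sketch, assuming awesome is installed:

    # ~/.xinitrc -- everything startx needs to bring up an awesome session
    exec awesome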

That’s what I’m working with. Just so you know. The mocking can begin now.

Fa Sol La

I’ve developed a new recreational activity. As if I needed another one.

I did a little shape note singing at the Morris Dance gathering, as I usually do, but this year something clicked. I’m not sure exactly what it was: I’d been singing a lot that weekend and my ears were used to listening to and picking out harmonies, my voice was a bit tired (and thus a more comfortable bass), and I was sitting directly behind an incredibly powerful bass. Any one, or all, of these things coincided to produce a really amazing experience, one where I was able to feel the music. It was amazing.


I should break in and say that I’m not an incredibly musical person, and music/singing isn’t something that I did very much of growing up. I think teaching people (particularly boys) how to listen, how to sing, and how to listen for harmonies is incredibly difficult, and not something--certainly--that I was ever exposed to as a kid.

I played clarinet in middle school (and was in the 4th grade choir), and while I was able to do OK, I never developed an instinct for it; I didn’t really ever figure out how to listen.

In high school I started doing the dancing (International folk dance, Morris Dance, Contra/etc.) and that worked for me. I could feel the dance, the beat, the music, and I was able to learn the grace and mechanics after a few months. It was amazing, finally, to have a way to have “an ecstatic experience of the music.”


A friend of mine from dancing described shape note singing as “singing for people who don’t dance,” and I think that in a lot of ways shape note singing is more like dancing than it is like other musical forms:

  • Shape note singing is participatory: like folk dance, it’s meant to be done rather than watched. The music is arranged and harmonized in such a way that makes it hard to record accurately (the melody is in the middle of the harmonic range rather than on the top), and it’s sung loudly by large groups of people, and singers arrange themselves facing each other, so the closer you are to the middle the better you can hear.
  • There’s a “pulsing” feeling that you can sort of feel in your gut when you’re doing it “right.”
  • The shapes provide a way for people without classical training to understand and participate in singing, in the same way that folk dancers introduce a choreographical short-hand to teach people how to dance without requiring formal training.
  • It’s totally an ecstatic experience, and I’ve never left a singing without a little bit of a “singing buzz.”

So, this being said, this shape note singing thing is incredibly weird for me. Perhaps the weirdest thing I’ve done to date, which is saying a lot given the knitting and the Morris dancing.

Shape note music is very definitely in the category of spiritual music from the American Protestant tradition. The songs are all hymns; many singings--the best ones, really--have opening/closing/recess prayers, and the best places for singings are inevitably churches (high ceilings, limited upholstery).

And here I am, this dweeby Jewish guy, from a family that isn’t (historically/traditionally, on either side) particularly religious (i.e. theistic) or observant. My own religious/spiritual views range from “limited” to “existential/queer,” and it’s not something I lose a lot of sleep/attention over.


I called a song at the last singing, Hallelujah from the ’91 Denison (Red) book (I forget the number at the moment; it’s a popular one), and the leader asked which verses I wanted to sing.

Dude. I haven’t a clue.

The verses are nigh on irrelevant for me. In a strange way, what I think of as the ecstatic experience of the music, the “singing buzz,” is what a lot of people think of as the “spiritual” aspect of a singing. And for me it has more to do with the “space” and the moment, and less to do with G-d. But that’s just me.

We sang, at M.N.’s suggestion, verses 1 and 4. But it was a good song; we could have sung ’em all, and I wouldn’t have cared one bit.


In any case, the one thing I know for sure is that I want to sing more often.

New Workstation Trials

Rather than bore you with the minutiae of my move (e.g. shit, I need to get a shower curtain; why is all my toilet paper on the moving truck; I’m getting really sick of sitting on the floor while I wait for the moving truck), I thought I’d talk a little bit about something that seems to occupy a bunch of my thinking these days: how I’m setting up my work-desktop computer. It’s trivial, an ongoing process, and of minimal interest to most other people.

So I think I’ll make a series of it. There’s this post, another that I’ve written, and a few other thoughts that I think fit in. And truth be told, I’ve been spending so much time recently packing things, attending to chores, and dealing with other crap that it’ll be good to have an excuse to do some writing again.


Here’s the setup: I have an iMac of contemporary vintage for a workstation, which is of course running OS X 10.5 Leopard, by default. Here are the challenges I found myself facing:

  • OS X is a great operating system, but I’m not a Mac guy anymore, much to my surprise. I installed Quicksilver pretty early on, but the truth is that my head isn’t shaped that way any more. The mouse frustrates me, and I don’t really use the things that make OS X great.
  • All my other machines and working environments for the past while have been Linux-based, and I’ve basically come to the conclusion that having one environment shared between a large number of boxes is preferable to having different and unique operating environments.

For a long time, I got the unified operating environment by virtue of only using one portable computer; now I just keep my laptop and desktop the same (more or less) and get a very similar result. This is probably worth its own post, but I was very wary of having a work machine that was too radically different from what I’m used to and what I use personally.

So if I need to run Linux, there are really only three options:

  1. Level OS X and just run ubuntu/debian/etc.
  2. Set up a dual boot OS X/Linux system, and switch as I need to.
  3. Run Linux in some sort of virtual machine environment like VMware or VirtualBox.

Option one is simple, and I liked the thought of it, but it seems like such a waste, and what if there were some Mac-specific app that I wanted to try? It also removes the option of using OS X as a backup. While I’m no longer a Mac guy, I do think that OS X’s desktop environment is superior to any non-tiling window manager for X11/Linux.

Option two works, certainly, but I hate rebooting, and in a lot of ways option two would be functionally like option one, except I’d have less accessible hard drive space, and I’d have to reboot somewhat frequently if it turned out I actually needed to use both. Best and worst of both worlds.

Option three is frustrating: virtual machines take some sort of minor performance hit, integration between host and guest is sometimes difficult and confusing, and guest operating system stability is largely dependent upon host operating system stability.

Ultimately I chose option three, and as I’ve used the machine, the performance hit has been totally unnoticeable, and though it took a bunch of futzing, I think I’ve finally settled into something that shows a lot of promise of working. (Ed. note: there’s a bit of a production lag on the blog.)
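
For the curious, the VirtualBox setup can be driven entirely from the command line. A rough sketch with VBoxManage (the VM name and sizes are just examples, and subcommand details have shifted between VirtualBox versions):

    # create and register a VM, give it memory, a disk, and a controller
    VBoxManage createvm --name archbox --register
    VBoxManage modifyvm archbox --memory 1024
    VBoxManage createhd --filename archbox.vdi --size 20000
    VBoxManage storagectl archbox --name IDE --add ide
    VBoxManage storageattach archbox --storagectl IDE \
        --port 0 --device 0 --type hdd --medium archbox.vdi
    VBoxManage startvm archbox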


I think that’s sufficient introduction. There’ll be more, don’t worry.