It's my birthday.
It's a little annoying that Matt (or rather, Interconnected, it being a separate entity) is on autopilot right now, because I'm going to talk a little about his notes on cascading and recombinance and, through that, tie together some of the threads he's been grouping.
Here's some ideas and observations just to throw out there:
We all know that an object's utility increases roughly with the square of the number of other objects that it can talk to. We know this because of the relationship between the number of edges and nodes we have in a graph: the possible edges grow much faster than the nodes do (I've spent the whole bloody week thinking about graphs and edges and nodes for an assignment). We also know this because we like thinking about social software and nurturing rich connections amongst communities of people online and appreciate the benefits that inviting large numbers of people into a discussion can bring.
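To make that concrete--a back-of-the-envelope sketch of my own, nothing more--here's the edge count of a complete undirected graph on n nodes, which grows quadratically:

```python
def possible_links(n):
    """Edges in a complete undirected graph on n nodes: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# Double the objects, roughly quadruple the possible conversations:
for n in (2, 10, 100):
    print(n, possible_links(n))
```

Two objects give you one possible link; a hundred objects give you nearly five thousand.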
Let's take a view from one level - things that are in our world that aren't people: they're getting smarter and they're getting more interconnected. Of course, they're still incredibly stupid (compared to us), but what we've got now, that we didn't have before, is a large array of relatively dumb things (still smarter than before, remember) that are just about starting to talk to each other.
We would have dumb things like, say, VCRs. VCRs don't really talk much, and one of the best innovations in VCRs was when they suddenly gained the ability to listen (they stayed mute, though), and in one fell swoop, every new VCR was spared the ignominy of sitting there with a blinking "12:00" light on its front. VCRs had learned (I'm terribly sorry, there'll be much more anthropomorphisation to come) to listen to the data that was coming in off television signals and set their clocks themselves.
Witness the next step in VCRs: personal video recorders. This time, VCRs, or PVRs now, learned to listen slightly more (grab scheduling data over copper wire), but still aren't really talking. Note that PVRs grabbing scheduling data is more of an advance than the early Video+ system (until, of course, those VCRs started listening for broadcast flags that signalled when a particular programme had started).
VCRs became less dumb, more useful and easier to use when their communication abilities increased. This is what it means to be part of a network, albeit in this case it was more of a push network, with no real interactivity.
That's great. We should have more objects that can listen, and maybe we should have more objects that can talk, too. Matt's take on Microsoft's Smart Objects idea, though, is that it's not enough to have a self-contained device that takes data in at one end and then squirts out a behaviour on the other. Note that I said the device will squirt out a behaviour: it's not entirely clear what the smart alarm clock will do other than wake you up early or let you know the traffic conditions when you get up.
One of the reasons why people like UNIX-like operating systems is that they have this principle of filters and pipes. With filters, you put something in, some magic happens, and then you get something different out. Pipes are simply a way of standardising how filters accept input and produce output. What you end up with is a whole bunch of utilities that take an input and produce an output, but that also accept input from pipes and send output to pipes.
Thanks to all of these pipes and filters, it's suddenly a trivial task to string together a whole bunch of filters and feel like you've done something tremendously complicated in practically no time at all.
Small pieces, talking to one another.
Throw in some other parts of logic, some sequencing, repetition and choice and now you have an even more tremendously complicated thing that you've thrown together, without having to understand what it is that's going on underneath: all you need to know is that if you use program foo and give it a URL, it'll spit out a slightly altered one that you can pass to program bar.
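Here's a rough sketch of that idea in Python (my own toy filters, loosely echoing grep, tr and head, not the real Unix tools): because every filter takes lines in and hands lines out, any of them can be strung into any other.

```python
from itertools import islice

# Each "filter" accepts an iterable of lines and yields lines,
# so any filter's output can be piped straight into any other.
def grep(pattern, lines):
    return (line for line in lines if pattern in line)

def to_upper(lines):
    return (line.upper() for line in lines)

def head(n, lines):
    return islice(lines, n)

lines = ["apple pie", "banana split", "apple crumble", "cherry tart"]
# In spirit: cat lines | grep apple | tr a-z A-Z | head -1
result = list(head(1, to_upper(grep("apple", lines))))
print(result)  # ['APPLE PIE']
```

The point isn't the filters themselves, which are trivial, but that the standard interface makes the combinations free.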
This is, coincidentally, why a lot of people are falling over themselves to play with OS X, that bizarre operating system from Apple where not only do you get a yummy GUI (though certain UI elements aren't exactly fixed yet), you also get all the yummy bits of UNIX, namely all these pipes and various bits that you can hook up to each other. That's without going into the goodness that is the wonder Apple have built into the OS called AppleScript, which even lets "normal" GUI programs talk to other programs.
You see the same kind of thing happening with object oriented programming, where you model real world problems with objects: objects have attributes (things that describe them) and methods (things that they can do). Take, for example, your car: it's red (an attribute) and it can move (a method). Here's the fun part: the objects talk to each other, because their methods can pass messages around. In fact, you get pretty much nothing done at all if you don't pass messages around, because all your objects will sit passively, not unlike a disinterested rock.
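A minimal sketch of that car example (the classes are my own, hypothetical ones): attributes describe, methods do, and nothing at all happens until one object sends another a message.

```python
class Car:
    def __init__(self, colour):
        self.colour = colour      # an attribute: something that describes the car
        self.position = 0

    def move(self, distance):     # a method: something the car can do
        self.position += distance

class Driver:
    def commute(self, car):
        car.move(10)              # message passing: the Driver asks the Car to move

car = Car("red")
Driver().commute(car)
print(car.colour, car.position)   # red 10
```

Leave the `commute` call out and both objects sit there passively, exactly like that disinterested rock.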
The problem with these filters and these objects is that most of the time, they don't tell you what it is exactly that they can do. There's no real equivalent of sending them a message to ask what they can do and having them respond with something nice like "Well, if you gave me x, I'd play with it for a while and give you y, how about that?"
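For what it's worth, you can fake that conversation in a language with introspection. This Python sketch (the Kettle is entirely made up) asks an object to list its public methods and what they claim to do:

```python
import inspect

class Kettle:
    def boil(self, litres):
        "Boil the given amount of water."
        return f"boiled {litres}L"

def describe(obj):
    """Ask an object what it can do: list its public methods, their
    signatures and their docstrings."""
    return [
        f"{name}{inspect.signature(method)}: {method.__doc__}"
        for name, method in inspect.getmembers(obj, inspect.ismethod)
        if not name.startswith("_")
    ]

print(describe(Kettle()))
```

It's a pale imitation--the docstring is for humans, not machines--but it's the shape of the "what can you do?" question.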
This is where Matt chimes in: all these Smart Devices aren't smart enough, they're only half-fulfilling their potential if all they can do is suck in information and display it. If they can suck in information, display it and squirt it back out to whoever wants it, their utility increases manyfold.
In other words, imagine it like this. In a world where all objects (physical or abstract) are online and able to not only receive and send messages but to publicise what they do, creating novel applications becomes less of a chore and more like playing with Lego.
Let's explore this with the alarm clock example again--Microsoft's SPOT Alarm Clock will pull in assorted data off the network and adjust its display. However, it doesn't sound like it'll be able to do anything else (or, if it does, it may well be integrated into some sort of .NET service walled garden). On the other hand, let's try it with the send, receive and publish model:
You take your new clock home. Essentially, it's no more than a display with a simple processor behind it. When you take it home and power it up, it allocates itself a link-local IP address and joins your wireless network. Then, using a protocol like Rendezvous, it announces itself to the other devices in your network and lets them know what it can do. All the other devices on your network--your PVR, your fridge, your light switches, your television, your laptop, your PDA--now know about your new alarm clock. At the same time, your alarm clock now knows about all of the above devices. So far, so hunky-dory, but this is where it gets interesting.
Your alarm clock displays the time. Your alarm clock is also connected, through your home's gateway, to the rest of the internet, so grabbing traffic reports is trivial. So far, so Microsoft Alarm Clock. But now, this alarm clock knows how to discover services on your home network, so there'll be a button on your alarm clock that says something like "What else can I do?", and pressing it will bring up a nice menu that lets you see all the other activities the other devices in your home make available.
Fair enough, that might not be terribly interesting. But setting your alarm clock via your PVR's interface? Setting your alarm clock via your PDA? All these things become possible when devices are able to send and receive information and also make available what operations they can perform.
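Here's a toy, in-process sketch of that send, receive and publish model (every name here is invented, and real discovery would ride on something like Rendezvous over the actual network): devices announce their operations to a shared registry, and anything else on the network can list them and invoke them.

```python
class HomeNetwork:
    """A stand-in for the home network: devices publish named operations."""
    def __init__(self):
        self.services = {}          # (device, operation) -> handler

    def announce(self, device, operation, handler):
        self.services[(device, operation)] = handler

    def what_else_can_i_do(self):
        return sorted(self.services)

    def invoke(self, device, operation, *args):
        return self.services[(device, operation)](*args)

net = HomeNetwork()
alarm = {"wake_at": None}
net.announce("alarm-clock", "set-alarm", lambda t: alarm.update(wake_at=t))

# The PDA (or the PVR, or anything else) discovers and drives the alarm clock:
print(net.what_else_can_i_do())     # [('alarm-clock', 'set-alarm')]
net.invoke("alarm-clock", "set-alarm", "07:30")
print(alarm["wake_at"])             # 07:30
```

The interesting part is that the PDA never needed to be taught about alarm clocks; the clock published what it could do, and that was enough.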
This is what Matt means when he talks about cool things today ending in knots. The beauty of Lego is that every bit knows how to connect to other bits. They have clearly defined interfaces (the knobbly bits and the indented bits) and, well, we can see them, so we know what they do--which is where the analogy falls down somewhat. The point is, though, that we can easily see what they do and how we can fit them together in new ways. There are only a few simple rules, but the combinations that can be produced--and can be produced quickly and easily, because that's the killer--are astounding.
It is an exciting time to be alive. This is a time where a company has finally been able to produce low-cost passive IDs for objects, where low-cost is under ten cents. Cover everything in IDs. Put smart dust everywhere, smart dust that can talk and will tell you what it can do and what it can see.
We started at the object level, examining the devices we use and how they're changing, and we've slowly moved upwards: the next level is watching how we as people use those objects and how they help us create richer, more complicated, faster, more informationally dense relationships than before. There is an immense wealth of data out there that's being spontaneously generated purely through things being alive and acting, through things moving and interacting. Imagine a time and a place where all the time and every place was like bouncing off someone and coming up with new ideas that you could string together and implement in minutes.
The Guardian has a guide to Thanksgiving:
Today is Thanksgiving, the most beloved of US holidays, so that's why none of your business calls to America are being returned. [more]
Via Metafilter comes information about DARPA's project to create a self-healing minefield. On the one hand (a slightly small one: it's cool from a technological point of view), this is interesting; on the other, well, mines are evil.
It's been pointed out, though, that standard practice is to deploy a field of anti-tank mines (which are, apparently, trivially easy to remove using mounted infantry, seeing as the triggers for anti-tank mines are (supposed to be) insensitive to humans) and then protect that field with another deployment of anti-personnel mines. The self-healing field of anti-tank mines would then require no anti-personnel mines and (in someone's perverse world) would be Better All Around, never mind that some have pointed out that, say, no mines at all would be Better All Around.
This would be all well and good were it not for the surprisingly tasteless shockwave flash demo showing off the technology (where shockwave flash itself seems like a technical term of the project).
Via Slashdot, yet another article about The Sims Online, albeit this time gawping at the possibilities that large online communities can produce:
[As] online games accelerate their journey to the mainstream, the possibilities will multiply. And the great, scary thing is that no prognosticator can guess what the community will think up, any more than Tim Berners-Lee could have anticipated eBay. All we know is that these environments, which can harness the intelligence of millions, are likely to spawn a story as big and as unexpected as eBay. What could you do if you had a million people helping you? [more, from Business 2.0]
Of course, what we're talking about in games like The Sims Online and in examples like eBay is the bubbling up of properties--emergent properties--that you can get when you stick a few hundred thousand people together and watch the feedback loops appear, which is qualitatively different from something like Cloudmakers, where the design was more top-down, albeit more reactive than before. Then again, the Cloudmakers did spawn (to the audience, at least), a big and unexpected story.
Some quick ones:
In two moves, one vaguely surprising (appointing a woman, seeing as they don't have many internally) and one vaguely not (appointing from overseas), Cambridge has named the Vice Chancellor who will be succeeding Sir Alec Broers:
Cambridge University is to appoint the first woman "chief executive" in its 800-year history, the Guardian has learned. The university is expected to announce next week that Alison Richard, a British academic and Cambridge graduate who is provost of the ivy league American university Yale, will become its new vice-chancellor. [more (via Education Guardian)]
Coming on the back of the news of the surprisingly successful demonstrations at Cambridge against top-up fees (I was talking to a few friends about this, and we were singularly impressed at the number who took part, which isn't really typical of apathetic middle class Cambridge students, and at the reaction of the university), I'm really quite happy about how the university is doing. Hopefully Professor Richard can shake things up a little more.
Some quick ones:
Via Slashdot, an absolutely hilarious story about Tivo user profiling that cannot fail to anthropomorphise technology:
Mr. Iwanyk, 32 years old, first suspected that his TiVo thought he was gay, since it inexplicably kept recording programs with gay themes. A film studio executive in Los Angeles and the self-described "straightest guy on earth," he tried to tame TiVo's gay fixation by recording war movies and other "guy stuff."
"The problem was, I overcompensated," he says. "It started giving me documentaries on Joseph Goebbels and Adolf Eichmann. It stopped thinking I was gay and decided I was a crazy guy reminiscing about the Third Reich."
He mentioned his TiVo tussle to a friend, who told an executive at CBS's "The King of Queens," who then wrote an episode with a My-TiVo-thinks-I'm-gay subplot. [more (WSJ)]
Some quick ones:
To spammers generally, and in particular to the one idiot who sent me the same spam 35 times in the space of two minutes this morning: please fuck off and die slowly, painfully and screaming.
Tim O'Reilly proposes a new definition of "productivity application":
For years, we've let office productivity applications define "productivity," yet Apple knows that the new frontier of productivity is not a new spreadsheet, word processor, or email client, but rather, tools for managing a consumer's growing array of digital assets: photos, music, and videos. [more]
Jack Schofield found a payphone with an ethernet jack:
Heading for the gate for a flight to Phoenix, I just happened to spot that a Sprint public pay phone had an Ethernet port on the side. Which is not to claim I am eagle-eyed: an American Comdex visitor had a boat-anchor of a notebook PC attached to it at the time. Said American kindly let me use his CAT5 cable (mine was in the hold), so I plugged into the power socket and Ethernet, got an IP address, dropped in a quarter and did an e-mail synch. [more]
Top entries served from this site over the past few days in response to Google queries:
Other popular entries:
I am an Electric Monk. I can hold conflicting beliefs in my head without falling over. Today, the conflicting thoughts I've been holding in my head have been:
What is it with people who don't take the time to learn how to use a computer? It's not as if it's terribly hard. Seriously. Hit the big Start button. No, the Start button. It's big. It's green. It's in the bottom left hand corner of your screen. What do you mean you don't know what the difference is between RAM and HD space? You don't know what processor you've got? No, it's not "Windows".
Computers don't work yet. Don't bother too hard trying to understand what's going on inside them. It's of no consequence. When computers work, you won't need to know. They'll let you do what you want to do.
I side with the latter more than the former. I don't, and shouldn't have to know what processor I have inside my computer any more than I should have to know what IC is inside my phone. I use my phone to call people with. I don't care what operating system my phone uses. It's irrelevant. It helps me to make calls. I don't know what IC is inside my calculator, but it does a whole bunch of stuff. It doesn't not work. It just does it.
The point being: whenever a disrupting technology comes about, when you're an early adopter, whenever this disrupting technology has not yet matured, there is a need--albeit only a temporary need--to be somewhat knowledgeable about internals. However, that temporary need soon disappears--witness the phone example.
Things I don't need to know about, because they Just Work: televisions, phones, mobile phones (less so, and depending on where on the technology curve the phone is), cars, watches, calculators, DVD players (slightly less than the rest), CD players, walkmen.
I am dependent on all of these things (to the extent that I could be dependent on things like television and walkmen). My mobile phone(s) are an integral part of my life. I organise my social life with them, I organise my work life with them. Everything in between, I organise with them. If my phone broke, I would have to do something about it. Most likely, use the other one, or get it replaced. There is nothing user-serviceable. There should be nothing user-serviceable: the mantra, again, is it Just Works, and if it doesn't work, it's Not My Problem.
Computers don't Just Work. People, though, quite rightly believe that they should. Why hold this belief? Because computers are everywhere. Because we are dependent on them, and because they've started to blend into the environment, the background.
At every other time in history, blending into the background has been an indicator of maturity: mobile phones in Europe and East Asia are ubiquitous and mature, telecoms itself is ubiquitous and even more mature. Microprocessor controlled traffic lights are everywhere and work without fail (because they have to), and the countless cars they shuttle the world over enjoy a far higher reliability rate than computers. "Computing" is nowhere near mature. Computing will be mature when it doesn't break, when it's ubiquitous, when everything is networked and we can honestly believe that it Just Works.
If you have no idea what's going on inside your computer, there's nothing wrong with you. It's to be expected. We've been told that they are tools that will help you, and tools aren't supposed to be quite so unreliable. When every other item of "high technology" manages to work without fail, you can be forgiven for being surprised that someone wants to know what operating system your desktop is using, because you'd be as surprised if someone wanted to know what chipset your DVD player used to decode DVD video (and if your eyes glazed over at "chipset", that's not a problem, either).
Much of Smart Mobs is, of course, speculation. What happens when RFID (radio frequency ID) tags cost as little as a penny and can be installed in every physical object? You could look up Greenpeace product ratings from the supermarket. It would be useful to know, say, that the 8:30 bus is always late and there's time to buy a bagel. You could pay for it with smart money, with RFID tags embedded in it so who owned it and what it was spent on could be tracked. [more]
New Scientist writes about a Microsoft research project aimed at archiving your life--anyone who's ever wanted to Google search their life will love this:
Engineers are working on software to load every photo you take, every letter you write - in fact your every memory and experience - into a surrogate brain that never forgets anything, New Scientist can reveal
It is part of a curious venture dubbed the MyLifeBits project, in which engineers at Microsoft's Media Presence lab in San Francisco are aiming to build multimedia databases that chronicle people's life events and make them searchable. "Imagine being able to run a Google-like search on your life," says Gordon Bell, one of the developers. [more]
MyLifeBits is a project of the media presence group at Microsoft Research's San Francisco Lab. A paper on MyLifeBits, to be presented at ACM Multimedia 2002, will be available online in December in MS Word (1.4mb) and PDF (297kb) formats (and wins points for including the word memex in its title).
Update: new details of the MyLifeBits project (6 December 2002)
Philips announced a new device at Comdex for Home Entertainment--a wirelessly connected component (via 802.11b/a) that will make content stored on PCs accessible to existing AV components in the home. Things that can't listen and talk to other things are boring now. Lose the convergence. Get on the network.
"The buzzword is not convergence, but connectivity--getting devices to talk to one another," said Michael Gartenberg, an analyst with Jupiter Research. "Consumers resoundingly rejected convergence, so these companies have to be careful about making devices easy to use. Consumers don't want...to have to figure out how to get devices to talk to one another." [more]
I was really meaning to write down a little of what Pinker's ideas had triggered, but it looks like Matt's jumped the gun, which is probably a good thing. I'm now spurred to actually finish off the piece I'd started.
An aside: Matt's comment software is really good. Up until now, no one's really bound emails to weblogs automatically, instead making do with popup windows (lag! people pressing buttons tens of times just because the page hasn't refreshed or the form hasn't submitted or or or...). This is great, though. Send an email and:
Emily MacDonald, a PhD student in astrophysics at Oxford University, was chosen ahead of 450 applicants to spend three weeks at the Arctic base last summer.
Ms MacDonald, 25, from Troon, Ayrshire, says: "The equipment and the research units the Mars society create are likely to be very close to what is eventually actually used to go on a mission to Mars, so this is very valuable research and Euro-Mars will be a great boost for us." [more]
It's taken a surprisingly long time for the Conservatives to push for this, but they've finally done it: Oliver Letwin was on the Today programme this morning outlining the case for a cabinet minister for Homeland Security [BBC News, Today on Radio 4, The Guardian], presumably after an amiable chat with Tom Ridge, head of the Department of Homeland Security in the US.
It's not entirely clear what Letwin--or the Conservatives--actually want. On the one hand, they're calling for greater deregulation, decentralisation and a generally laxer style of government intervention. On the other hand (and perhaps typically, when it comes to security), they're advocating creating a new cabinet post whose holder, we assume, will have the power to go over the heads of the other cabinet ministers. Blunkett, on the other hand, pointed out that a lot of the US Department of Homeland Security's bureaucracy (spending well over £1 billion, with over a thousand employees) went some way towards recreating what the UK already has--a domestic intelligence service.
Oh well. Things are looking up (a little) in the US, where s.880 of the Homeland Security Bill has prohibited the implementation of the Operation TIPS part of the Citizen Corps--with the result that the Operation TIPS website has vanished (via Boing Boing).
The Guardian takes delight in reporting Tourism Minister Kim Howells' assertion that "most Americans are so ignorant of their kinsfolk in Britain they believe the UK is a far off country 'somewhere' in the Middle East":
"Very often people do not understand the title of the country," Dr Howells told MPs of Commons culture select committee yesterday. "In America, people had heard of London, some had heard of England, no one had heard of the United Kingdom - they thought it was somewhere in the Middle East." [more]
The Guardian would do well to improve their fact checking. Later in the article:
England and Britain are well known brands, they said, but the UK hardly registered. Even high-placed Americans struggle with the title. Foreign Office mandarins had to have a quiet word with American diplomats in 1998 after the CIA world factbox said that 1801 marked the date of the UK's "independence". (emphasis added)
Er, that'd be the CIA World Factbook, no?
Then again, this is the paper that's earned the nickname The Grauniad for its sensational typos.
SFR: Which culture will assimilate which? Communism, Capitalism, or Islam?
Ken: Capitalism will assimilate everything that exists in the world today, no question. The interesting question is what happens then. Professor Meghnad Desai of the London School of Economics has recently written an interesting book called Marx's Revenge, in which he argues that what happens then is that capitalism begins to press hard against its limits, and socialism comes on the agenda for the first time.
Ken: Islam is a religion, not a mode of production, and is not counterposed to capitalism. Communism is a potential mode of production which, in the words of Lenin, 'requires the joint efforts of several advanced countries, which do not include Russia'. Well, today Russia is arguably an advanced country, but it could only reach socialism through joint efforts with other advanced countries. Stalin's 'socialism in one country' was always a utopia, and a reactionary one at that. There was never the slightest chance of the Stalinist states assimilating the capitalist countries. Nor is there the slightest chance now of the Islamic countries assimilating or overwhelming the largely secular West. [more]
Brute force is great--today IBM announced that it had won a $290m contract with the US military to develop, among other things, ASCI Purple. ASCI Purple:
"... is expected to have 196 interconnected 64-processor servers, making a total of 12,544 Power5 chips. It will come with 50 terabytes of memory--about 20,000 times as much as a PC. The supercomputer also will have IBM disk storage arrays holding 2 petabytes, or a quadrillion bytes, of data--about 50,000 times the capacity of a PC." [more]
The Times reports that, in terms of sheer performance and number of calculations per second, ASCI Purple is roughly equivalent to a human brain (see also information processing in the human body). Ray Kurzweil points out that thanks to the law of accelerating returns, we can reasonably expect something in the hundreds of petaflop range for around $1,000 by 2032.
ASCI Purple clocks in at the 100 teraflop mark, BlueGene/L at the 360 teraflop mark (what are all these flops?). Of course, we're going to have to figure out how to use all that processing power--I'm of the opinion that anyone who thinks the singularity is a hardware problem is mistaken: we will obviously have tremendous amounts of processing power. Processing power on its own just doesn't help us in reproducing intelligence, though.
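Just to sanity-check that Kurzweil projection (my own arithmetic, on the assumption that price-performance doubles roughly yearly, which is one common reading of the law of accelerating returns): start from ASCI Purple's roughly 100 teraflops for $290m and run it forward thirty years.

```python
def flops_per_1000_dollars(machine_flops, machine_cost, years, doublings_per_year=1.0):
    """Project price-performance forward, doubling at the given rate."""
    per_1000_now = machine_flops / (machine_cost / 1000)
    return per_1000_now * 2 ** (years * doublings_per_year)

# ASCI Purple: ~100 teraflops for $290m; project 30 years out to 2032.
result = flops_per_1000_dollars(100e12, 290e6, 30)
print(round(result / 1e15), "petaflops per $1,000")  # roughly 370
```

Which does indeed land in the hundreds of petaflops for $1,000, so the claim is at least internally consistent with its own doubling assumption.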
Just needed to keep these in the same place for a while:
The Pathe newsreel archive has been (more fully) launched, with 2,000 hours' worth of low and high-resolution video downloads available in WM8 format (Microsoft's version of MPEG-4). They seem to have a bizarre (though perhaps understandable) policy on sharing the free material that they provide:
We encourage users to share Preview Files that they download with colleagues and friends around the world via e-mail. We would ask that this is not achieved by publishing Preview Files on-line. The only instance where we feel this might be appropriate is within a closed user group in an educational environment.
Putting social pressure on spammers:
After considering it, I have decided not to sue. And I'll tell you why. It won't change anything. I have successfully sued in court twice and I won both times. (Those are two longer tales that are not relevant to this story, so I will not go into it. Suffice it to say, these people were scum.) All that happened was that I spent thousands in legal fees to achieve a victory which resulted in a cash settlement which was never actually collected. No, I will not sue the Bannon's in court. However, I will not rest until their site is shut down and they stop spamming. I will do everything in my power, using ethical and legal means, to force them to shut down, to complain loudly and to shame them into quitting their spamming venture. I am going to make sure this spammer never sends out another piece of spam again. [more]
So, we know that MS is ready to push smart objects (now known as SPOT--small personal object technology), but one of the other things they unveiled at Comdex this year was Microsoft Office OneNote, a new note-taking program for MS Office.
My first reaction, after having a look at the blurb and the use case scenarios, is nothing more than a gadget-lusting "I want one". Then again, that's not surprising, really. I want most gadgets. There's something about OneNote, though--or Tablet PCs--that's alluring and that a lot of geeks have dismissed.
One objection is that all this pen-input stuff is a complete waste of time: who writes nowadays? If you can type at 98wpm, much faster than you can write, then there doesn't really seem to be much point. If I'm honest, though, there are still occasions when I do have the strange archaic urge to splurge ink all over a page. Lately, they've become more or less concentrated around the times when I'm sat in a lecture hall taking notes.
Hang on a second. I've got a laptop--and a nice one, too. I have word processors and outliners falling out of my ears. It's not as if I'm stuck for choices of note-taking. Or am I? It seems that I am. For a long time (granted, this was before I'd bought myself a laptop), I scorned people who took them into lecture halls. Mainly this was at Cambridge and during law lectures, and a few years ago you'd get the odd student lugging a ThinkPad in, waiting a few minutes for it to boot up, getting embarrassed when Windows went "TA-DAAAA" when she logged in and wondering why, exactly, Word had decided to crash when she most needed to get down a precedent. "See," said I, "I wouldn't want to take notes on a laptop because I'd be scribbling things down in margins, I'd be wanting to draw loopy arrows all over the place, putting brackets in after the fact, crossing things out and so on. Typing in bulleted lists in Word or in free form is simply not the way to go", and I'd wander off in some sort of smug cloud whilst silently cursing that I didn't have a laptop to play with.
Well, I've got a laptop now and I still go to lectures. What's good? Well, it's good that I have all my notes in one place and I'm not likely to lose them. It's good that I can print out lots of copies. It's good that there's this wonderful thing called "search" that I can do on some flat text that doesn't involve an extraneous and terribly complicated sequence of saccades. All good things.
So what's bad? Why do I gaze at the student use scenario so forlornly? Because every so often, I need to draw Nassi-Shneiderman diagrams. Or I need to do a quick sketch of a binary tree. Or a parse-tree for some given BNF. Sure, I've got reasonably intelligent drawing software that has a bunch of predefined templates so I can draw UML and so on, and rumour has it that Word itself allows you to actually draw primitive shapes on the page nowadays, but what I'm really longing for is the ability to, say, just quickly sketch something and be able to keep it.
Keeping all this electronically requires discipline. There's a whole directory structure of courses, modules, notes, weeks, PDF lecture handouts, Word lecture notes, PowerPoint presentations and Java sourcecode, all intricately laid out so I know where I can find it. Then Microsoft comes along with this OneNote business and says that I don't have to bother with that anymore. And rightly so. I'm dying for a filesystem with metadata (cue salivating at BeOS's filesystem) where instead of worrying about where I put something so I can find it, all I have to do is worry about what it is that I'm looking for. I'd love to tag a bunch of files as lecture notes and just view an abstract collection of files that happen to be lecture notes--virtual folders! OneNote is moving slowly away from Office's traditional single-document model. Fine, Office has its Binders, but not many people I know use those, and they're rather unwieldy--too little is bad, not good--but look at this:
* Create as many notebooks, folders, and pages as you need; customize them for how you work; and keep them all in one convenient place.
* Search your notes quickly and find what you need without having to know in which notebook, folder, or file you saved that information.
* Create and search Note Flags to highlight urgent action items. Your important tasks and reminders will always be easy to find.
* Combine, format, and rearrange notes by using a drag-and-drop operation to organize your notes in a way that makes sense to you.
Oh. My. God. I'll wait for metadata-enabled filesystems. I'll wait and see how MS does with its SQL-backed filesystem. But this, right now, is close to a killer app. Make a notebook for a project. Organise it with separate folders. Don't worry about what's inside those folders; move the pages inside the folders about, customise them and--joy--search them. Draw in them! Put free-form text in! Drag and drop! It's like some kind of note-taking heaven.
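The virtual-folders idea is easy to sketch, incidentally: a file lives wherever it happens to live on disk, and a "folder" is nothing more than a query over tags. A minimal sketch in Python--every path and tag name below is invented purely for illustration:

```python
# Virtual folders as a tag query: files keep their real locations,
# and a "folder" is just the set of files matching some tags.

def tag(index, path, *tags):
    """Record that the file at `path` carries the given tags."""
    index.setdefault(path, set()).update(tags)

def virtual_folder(index, *wanted):
    """Return every path tagged with all of the wanted tags."""
    return sorted(p for p, tags in index.items()
                  if set(wanted) <= tags)

index = {}
tag(index, "notes/week3.doc", "lecture-notes", "compilers")
tag(index, "handouts/bnf.pdf", "handout", "compilers")
tag(index, "notes/week1.doc", "lecture-notes", "databases")

# Both .doc files turn up, regardless of where they sit on disk.
print(virtual_folder(index, "lecture-notes"))
```

The point being: you never ask "where did I put it?", only "what is it that I'm looking for?".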
That's why I want this. I don't care whether my handwriting is terrible or not, or whether it's slow. When I need to input large amounts of text, I can type. When I need to quickly annotate or sketch, I can draw. When I need to organise a whole bunch of documents together, I can, and I think that's going to be the killer feature. I don't like having a hierarchical structure that I must obey to store things efficiently. I don't work that way. Humans don't work that way. I want an amorphous blob of things that are like other things, and I want it all in one place. One other thing--you'd think that someone somewhere would suggest that this software be tied to Tablet PC, that only those shelling out huge amounts of cash for these new computers would be able to reap the benefits of actually being able to organise documents. Well, you thought wrong. One Note will work with any Windows PC, pen-enabled or not. Dear God I want this.
Update: Steven Johnson talks about OneNote and likes it for pretty much the same reasons.
Update: If you're coming here from Microsoft, say hi!
With Gates ready to unveil smart objects at Comdex (network-aware devices such as the oft-touted alarm clock that knows about traffic conditions and so can wake you up that bit earlier to ensure you're still in time for work--probably not the best example to use to sell a gadget to people), Fossil have shown a prototype Palm OS wristwatch:
Fossil's Wrist PDA with Palm OS has approximately the same features as Palm's recently introduced Zire, including 2MB of memory and a 160-pixel-by-160-pixel screen. It even comes with a stylus and, like other Palm OS devices, can accept Graffiti text input. However, because the pixels on the wristwatch screen are necessarily tiny, Fossil has rewritten some core Palm programs to use larger fonts or fewer icons and has added a jog-dial switch to make navigating the programs easier. The company says most Palm programs will run on the combo device, though some may benefit from being optimized for its tiny screen. [more]
Via Gizmodo, Time lists the best inventions of 2002, including: the virtual keyboard, date rape drug spotters (that don't work), scramjets (not a new idea, just a tested one), aerogel (old, but to be used soon for capturing cometary fragments) and, bizarrely, 3D online environments (not too behind the ball there).
With the (old) news that EA struck a deal with McDonalds for the placement of McDonalds kiosks in The Sims Online, Shift.com produces a call to arms:
Wired are now staggering the publication of the print magazine online: they used to release everything in one go on the same day (at least, I'm pretty sure of that), but now articles are published on a range of dates.
Reasons why the interwebnet is great, 232 in a continuing series: tell people if you're on the FBI's No Fly List (via Boing Boing). See also: Everyone Watches Everyone and Brin's The Transparent Society.
Everyone's linking to Steven Johnson's weblog, he most recently of Emergence: The Connected Lives of Ants, Brains, Cities, and Software fame. Lately, he's been playing with Google (who doesn't?).
One of the main reasons why I chose an Apple laptop over a Vaio was because of the battery life and the absolutely phenomenal sleep/resume speeds. Seriously. Close the lid. It goes to sleep. Open the lid. It's awake before you've finished pulling up the lid. This has never worked for me in Windows, XP included (not to the extent that I'd actually trust it enough to send a system to sleep). Being able to just close it at the end of a lecture, stick it in my bag and resume five minutes later is how laptops should work. I don't want to be starting up and shutting down the whole time. I don't want to be waiting while my system state is read from a 512MB file on my HD back into RAM. If Apple can switch to AMD or Intel and keep the same battery life and sleep/wake features on its laptops, then I'll be happy. Last I checked, AMD and Intel mobile processors weren't doing that well in the power consumption stakes.
The Guardian talks about news media catering to 18-34 year olds and mentions Comedy Central's The Daily Show (or, more accurately, The Daily Show With Jon Stewart):
Since September 11 2001, the show's guests have increasingly included heavyweight journalists, academics and politicians - expertly steered between the twin hazards of being boring and humiliating themselves in trying to be humorous with Jon Stewart, the show's host, who is a stand-up comedian by training. "More 18 to 49-year-olds get their news from the Daily Show than any other cable news programme," Comedy Central noted in a Wall Street Journal advertisement earlier this year. "Heaven help us," the ad concluded. [more]
I watched a bunch of The Daily Show while I was out in the 'States over the summer: it's kind of a blend of The Day Today (amusing animated infographics), Newsround (important news, lite) and Have I Got News For You (biting sarcasm and commentary). It's also completely hilarious. I mean, really. When I went up to New York to have a quick look round and meet up with some friends, I managed to get to a taping of the show (cunning planning on the part of the people I was with) on the day they'd found out they'd got an Emmy nomination. I have never laughed so much, so loudly and in so small a space with so many cameras to either side of me before.
Browsing memepool and seeing a reference to etaoin shrdlu (the two leftmost rows on a Linotype typesetting machine's keyboard), I was reminded of shrdlu, a program for understanding natural language. Check out how shrdlu got its name.
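("Etaoin shrdlu", for the curious, is just the twelve most frequent letters of English in descending order, which is why Linotype put them down the two leftmost columns of keys. A quick sketch of the frequency count--the sample text here is made up, so its ranking only roughly approaches the real English ordering:)

```python
# Rank letters by how often they occur in a piece of text.
from collections import Counter

def letter_ranking(text):
    """Letters of `text`, most frequent first, as a single string."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    return "".join(letter for letter, _ in counts.most_common())

sample = ("the quick brown fox jumps over the lazy dog " * 3
          + "she sells sea shells by the sea shore")
# On a large English corpus this converges on "etaoinshrdlu...";
# on a toy sample like this, only the top of the ranking agrees.
print(letter_ranking(sample)[:12])
```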
Not only did Brent Simmons release an updated version of NetNewsWire Lite that fixed a performance bug that appeared the day before (see change notes), but he's also started posting screenshots of Pro version features.
First three apps loaded on login:
After having mistakenly read the BBC News Online story "Studios unveil 'films to download'" as "Students unveil 'films to download'" and thinking, "gosh, that's terribly quick reporting of you there", the natural thing to do was to check out Movielink, the consortium of five Hollywood studios offering film downloads. Fine, the BBC News article warned me that I wouldn't actually be able to do anything--the service is only available in the US--but hey, I'm interested.
Instead, I get this.
Thank you for your interest in Movielink. We want you to take part in the powerful Internet movie rental experience that Movielink delivers, but it is presently unavailable to users outside of the United States.
Well, I knew that. So, you'd like me to take part, but right now, I can't. Because of that, you're going to offer me no information whatsoever? Wonderful. "Hey! HEY! There's a product here that interests people! We can't give it to them, but hey, they don't mind. While we're at it, let's not tell them anything! We won't even tell them what we do! Or what we charge! Why would anyone want to know anything like that? They can't get our product!"
Oh, wait, they did give me some information. They offer a powerful Internet movie rental experience. To people in the US. Well, that's me sold. I'll remember that.
Idiots. Wait, Hollywood Studios spawned the MPAA. That explains everything.
Wired blurb on The Sims Online:
With The Sims Online, a massively multiplayer game that goes live in November, Wright plans to take over the rest of the world. The basic playability is the same — each Sim pursues primary needs such as food, sleep, a social life, and the shortest path to the bathroom. But the online version dispenses with AI. Every Sim will be controlled by a flesh-and-blood person. Like EverQuest and Ultima Online, The Sims Online will mix elements of strategy games with an immersive social experience. And along with the forthcoming Star Wars Galaxies, it will test whether such games can cross over from the Renaissance Faire set. "Our game," Wright says, "is potentially bigger." He aims to have half a million players by the end of the year. [more]
The Guardian reflects on how mobile phones have changed society:
"By the year 2000, Mintel suggests that small pocketphones 'will be as common as Walkmans...' People would have to develop a whole new social code... You could not, for example, take calls in the middle of a crowded restaurant. Indeed, the potential nuisance effect of pocketphones (which, of course, exist at the moment, but are clumsy and extremely expensive) is enormous, though perhaps no more so than the nuisance of the transistor radio. Besides, the social value of being able to make a phone call at any time will also be extremely large."
The Guardian, May 6 1986 [more]
By 2003, removable solid-state media of the type used in digital cameras and mp3 players will reach between 1 and 2 gigabytes in capacity (Memory Stick 1, 2; CompactFlash), which isn't surprising at all, really. What's fun, though, is thinking of New and Exciting ways to use all this capacity--Matt Webb and Steven Frank both came up with the idea of Conversational Tivo, a random-access audio stream of your life [Webb 1, 2; Frank].
We don't have to use 3G anymore, or at least, we don't have to use it exclusively: we've now got enough storage space to simply spool to removable media if the bandwidth isn't there or we're out of range of a base station (heaven forbid we're stuck in a tunnel or in the middle of nowhere).
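The spooling idea comes down to a bounded ring buffer: record into local storage continuously, let the oldest chunks fall off the end, and drain everything upstream whenever a connection reappears. A sketch, with every name and size invented for illustration:

```python
# Spool-then-drain: buffer recent audio chunks locally, ship them
# upstream when the network comes back. Capacity is bounded, so
# the oldest chunks are silently discarded once it fills.
from collections import deque

class Spool:
    def __init__(self, capacity_chunks):
        # deque with maxlen drops the oldest item on overflow
        self.buffer = deque(maxlen=capacity_chunks)

    def record(self, chunk):
        self.buffer.append(chunk)

    def drain(self, send):
        """Ship every buffered chunk upstream, oldest first."""
        while self.buffer:
            send(self.buffer.popleft())

spool = Spool(capacity_chunks=4)
for i in range(6):            # record six chunks; only the last four survive
    spool.record(f"chunk-{i}")

sent = []
spool.drain(sent.append)      # "network is back": drain to the uplink
print(sent)
```

Nothing clever, but it's all the Daycorder-style gadget would need between base stations.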
Without further ado, I present a stopgap product:
About the same size as a Sony memory stick walkman, the Sony Daycorder has:
On the other hand, I'd be perfectly happy with a streaming 3G conversational tivo as well...
.... .- .--. .--. -.-- / -... .. .-. - .... -.. .- -.-- / - .. ..-. ..-.
/ -.-- --- ..- .-. / -. . -..- - / .... .. -. - / .. ... / ... .-..
.- -. - -....- -... .. .-. -.. .. . .-.-.- / - --- -- / ..-. . .-.. - /
- .... .- - / .- / .-.. .. - - .-.. . / -- --- .-. ... . / -.-. ---
-.. . / -.-. --- ..- .-.. -.. -. .----. - / .... ..- .-. - / -.-- ---
..- .-.-.- .- ... -.- / -.- .-. . -- .--. .- ... -.- -.-- / .- -... ---
..- - / - .... . / -. . -..- - / -.-. .-.. ..- . .-.-.-
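(That's Morse code above: letters separated by spaces, words by " / ". Should you want to read it, a throwaway decoder--the table below covers only the letters and punctuation that actually appear:)

```python
# Decode space-separated Morse, with " / " as the word separator.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z", ".-.-.-": ".", "-....-": "-", ".----.": "'",
}

def decode(morse):
    """Turn a Morse string into text; unknown codes become '?'."""
    return " ".join(
        "".join(MORSE.get(letter, "?") for letter in word.split())
        for word in morse.split(" / "))

print(decode(".... .- .--. .--. -.--"))   # the first word of the message
```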
Logging in to my Epinions account for the first time in months, my personalised page tells me that they'll help me "Find the Best Products for Your Baby". Which, I have to admit, is news to me.