Contents of RSS feed for:

Dan Bricklin's Log
VisiCalc co-creator Dan Bricklin chronicles his life in the computer world with pictures, text, and commentary.
Voice and Command Line User Interfaces
I have been seeing the increasing popularity of voice-controlled "smart speakers" like the Amazon Alexa series, including heavy use among many of my close relatives and even their very young children. Attending a Gartner conference, I watched analyst Jason Wong give a presentation about "voice first" apps, showing an example from Rhino Fleet Tracking and explaining Amazon's recently announced "Alexa for Business" offering. This was something that could be used for internal business applications, not just for personal home use. Hmm. Internal business use is my area of interest as CTO at Alpha Software.

By the time I reached the airport for the ride home, I had checked out some of the Alexa documentation and saw that it would be pretty easy to hook Alexa up to query a database and explore this area. After a little work back home, I found that connecting to the REST API of our new Alpha TransForm mobile forms data capture system would be even easier. I soon had a demo and made a short, minute-and-a-half video, "Simple business app with Alexa":

Video on YouTube
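The shape of such a hookup is simple: the skill's backend receives an Alexa intent request as JSON, queries a REST API, and returns the answer as spoken text. Here is a minimal Python sketch of that pattern; the endpoint, intent slot, and field names are hypothetical illustrations, not the code from the demo:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint and key -- the real Alpha TransForm API differs.
API_URL = "https://example.com/transform/api/v1/forms"
API_KEY = "YOUR-API-KEY"

def fetch_form_count(status, url_opener=urlopen):
    """Query the (hypothetical) REST API for the number of forms with a given status."""
    req = Request(f"{API_URL}?status={status}",
                  headers={"Authorization": f"Bearer {API_KEY}"})
    with url_opener(req) as resp:
        data = json.load(resp)
    return len(data.get("forms", []))

def handle_alexa_request(event, count_fn=fetch_form_count):
    """Turn an Alexa intent request into a spoken-text response."""
    intent = event["request"]["intent"]
    status = intent["slots"]["Status"]["value"]   # e.g. "open"
    count = count_fn(status)
    speech = f"There are {count} {status} forms."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

With the real service, `count_fn` would hit the REST API over the network; injecting it here lets the handler be exercised offline.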

I then got to discuss the video with a variety of analysts and others, and to think about this area a lot. One thing that struck me was the similarity between Alexa-style voice input and traditional Command Line Interfaces (CLIs). The CLI, first popularized in the computer world back in the 1960s or so and continuing today in Linux, Windows, etc., has always appealed to many professional developers despite the advent of GUI and touch-based interfaces. There is something special there. Perhaps voice and the CLI share some attributes that would explain the rapid rise of voice's popularity in smart speakers. I decided to write an essay exploring this.

The essay I came up with is "Revenge of the CLI: A transformation through voice and AI". It's in the Writings section.
Published: Sun, 28 Jan 2018 20:42:32 GMT
My talk on TED.com
I'm excited that a TEDx Talk I gave last November has been posted as a TED.com video. The talk is a short retelling of the VisiCalc origin story. It's a much-condensed version of a talk I've been giving for many years, updated and enhanced. I'm very happy with how it came out.

Watch the TED.com video: "Meet the inventor of the electronic spreadsheet".

TEDxBeaconStreet, where I gave the talk, is part of the TEDx program. It is an independently organized, TED-like event, named after "...Boston’s longest street, uniting towns and cities, neighborhoods and businesses, schools and families along the way." TED later picks up some of the talks given at TEDx events and posts their video on TED.com.

As I've often done in the past, I wrote an essay for the Writings Section of my website chronicling the process I went through to turn my long, rambling slide show into a tight, 12-minute TEDx-style talk. Viewers of the video may find the essay of interest. It was really interesting how helpful my experience crafting understandable, 140-character tweets on Twitter was for writing a smooth, short talk. I also found the ease and unobtrusiveness of popular recording devices (specifically, a GoPro Hero Session and the Voice Recorder app on my iPhone) to be important for capturing feedback when practicing in front of others.

Read "The story behind my TEDx Talk" in the Writings Section.
Published: Wed, 11 Jan 2017 17:01:35 GMT
The Frankston Challenge
This morning I went for a normal checkup with a healthcare provider I hadn't visited in a few years. The receptionist handed me a clipboard with paper forms to fill out to "catch up" -- nothing pre-filled. Luckily, I was given a very sharp pen -- the fields were often way too small (when you live in "Newton Highlands" you need a lot of room for "City"). When I got to the section on "Pharmacy," I remembered that that office had previously sent a prescription to the wrong location (same street and company, wrong city and address). To make sure this time, I needed the address and phone number. I took out my cell phone to look it up. One bar. Very slow and dropped connectivity. I found the address but gave up on the phone number. Across the room I saw a little sign with "WiFi" written on it. I guess I could have gotten up and followed the instructions to connect to their wireless router, if I trusted it, but I was too lazy. I left the field blank.


This was a reminder of what I call "The Frankston Challenge". As Bob Frankston has written repeatedly over the years, we need ambient connectivity that works everywhere, without a prior relationship or human intervention to get past sign-in screens. (For example, see: Understanding Ambient Connectivity.) Yes, there was Internet connectivity where I was (in a normal office building in a major suburb, on a busy street right near I-95/Rt 128), but it wasn't good enough or easy enough to answer a simple question quickly. For the laptops that the doctors and nurses used it was no problem -- they had already keyed in the special codes. For a new mobile device walking into the office, it was like waiting for an old dialup modem to connect, or getting nothing at all.


In the world of the Internet of Mobile Things (IoMT) you want to be able to just connect. Mobile Things include smartphones, tablets, wearables, automobiles, and so many coming things. They move with (or on) you, or on their own. The connectivity environment around them changes. The Frankston Challenge is very real for them today.


For example, you can't, as Bob points out, require a pacemaker or wearable reporting info to doctors and other systems to be pre-authorized with wireless carriers and WiFi access points everywhere the person might go. You don't want to have to pull out a keyboard (and maybe a credit card) to authorize connectivity when you feel tightness in your chest.


This problem won't be solved overnight. Bob has been working constantly to bring it to people's attention. For me, though, I'm in the world of making systems that business people use to build apps that run on tablets and phones that move around with the users. We can't wait. For now, we won't have reliable connectivity everywhere, even where you would expect it, so we have to build our systems to tolerate loss of connectivity. A consultant may be visiting a client's retail site or factory, or a repair person may be out in the field fixing a pump or transformer. The apps they run, which replace the paper clipboard with something better and more tied into the flow of their tasks, must be able to run offline.


We at Alpha Software have been working steadily to make disconnection-capable apps easier to build, so that offline tolerance can be the default configuration. This is especially true for data capture applications. Such apps don't necessarily ignore connectivity when it's available, but they don't become useless when it isn't.


A final irony: The office building where my doctor worked was the same one where my old company, Software Arts, had rented some space back in the early 1980s. It was across a parking lot from our main building. We used a centralized computer for timesharing to run our whole business. We had a very early Ethernet system installed to give "high-speed" connectivity to the terminals on everyone's desk. There was coax cable everywhere. (This was very early in the history of Ethernet, and the IBM XT was just coming out.) To connect the "remote" office hundreds of feet away, we had a trench dug under the parking lot (and then paved over) to run cable to that other building. We had high-speed connectivity there. Today I parked within a few feet of that trench. Connectivity was still a challenge over 30 years later.
Published: Tue, 10 Nov 2015 23:32:42 GMT
Long time no blog
It's been a long time since I posted here on my blog. I've written a couple of essays that appeared on the main Bricklin.com web site, and I've tweeted a lot (as @DanB), and even did some podcasting as part of the Adventures in Alpha Land podcast series, but not much here. My day job at Alpha Software has continued to keep me very busy with other things.


You may have missed my Alpha WatchBench app for Apple Watch. See "How Alpha WatchBench Came About" on my main web site.
Published: Tue, 10 Nov 2015 23:30:54 GMT
Essay about issues and techniques related to disconnected mobile apps
We're spending a lot of time at Alpha Software working on making it easier to create mobile business apps that support sometimes- or frequently-disconnected operation. As part of that work, I've learned a lot about how this issue is an impediment to widespread mobile app deployment in business and also about many of the areas that must be addressed in order to do it well. I've just posted an essay that covers a lot of what I've learned.

You can read "Dealing with Disconnected Operation in a Mobile Business Application: Issues and Techniques for Supporting Offline Usage" in the Writings section of my web site.

[Image in the original post on Dan Bricklin's Log with caption: The Offline Problem: Some common images of low connectivity]
Published: Tue, 15 Jul 2014 19:22:56 GMT
HTML5 First: Google innovators prototype in the browser
Two days ago, on Wednesday, I watched the Google I/O Keynote live stream. Early on in the two and a half hour mega-presentation they introduced their new "visual language" for UI design: Material Design. It has "materials" inspired by paper with shadows, bold colorful graphics, and "meaningful" motion/animation.

As a developer who makes heavy use of HTML5, what immediately struck me was this statement, made starting at about 20:20 into the video:

"Last year at I/O we announced Polymer, which is a powerful new UI library for the web.

Today, we're bringing you all of the Material Design capabilities to the web through Polymer. [Applause] As a web developer, you'll be able to build applications out of Material Design building blocks with all of the same surfaces, bold graphics, and smooth animations at 60 frames per second. [More applause, followed by the speaker smiling and ad-libbing: "That was good..."]

So, between the L preview and Polymer, you can bring the same, rich, fluid Material Design to every screen."

Wow, I thought, Google not only designed a mobile UI for their Java-driven devices, they went to the trouble of also then building it in HTML5 for web apps (mobile and desktop).

I was wrong. I did some looking at the documentation for Polymer, and in the FAQ I found this (emphasis added):

How is Polymer related to material design?

Polymer is the embodiment of material design for the web. The Polymer team works closely with the design teams behind material design. In fact, Polymer played a key role in material design's development: it was used to quickly prototype and iterate on design concepts. The material design components are still a work in progress, but will mature over the coming months.

So, the HTML5 version wasn't created after the native versions. It was the prototyping environment before the native code.

This is a great model to follow: Prototype, iterate, and even first ship, in HTML5. Once you know what you need, if necessary, take the time to do native code. This doesn't just apply to the old desktop (as it has for years). It also applies to today's polished, fluid mobile world.

As a developer who is closely connected to a system that produces HTML5, and that aids in rapid prototyping, I was delighted. Here are some of the leading-edge mobile developers, and they found that HTML5 has the power to do what they want, and to do it quickly enough for the demands of the iterative design and testing that is so important in the mobile world. After hearing so many people claim that "they hear" HTML5 doesn't have the power for serious mobile applications, it was vindication to hear of people who actually build things choosing the HTML5 route, even when cost clearly was no object, and succeeding.

To be fair, the Material Design developers did depend on the latest browser versions, which have built-in support for hardware-accelerated animation and compositing, and on a sophisticated JavaScript library to make coding easier. However, most mobile devices are now getting upgraded frequently to the latest browsers (Google announced better upgrading for Android in the same keynote and Apple brags about the large percentage of iOS users who run the latest version) and most mobile HTML5 developers and systems (the one I use included) make use of such libraries already.

I guess it's time for a new mobile-related tag: #HTML5first.
Published: Fri, 27 Jun 2014 18:24:21 GMT
A new app from Dan and Alpha: AlphaRef Reader
For various reasons I decided that I needed an app tuned to reading reference material. I've released a new app that demonstrates addressing this need to the Apple App Store and the Google Play Store: AlphaRef King James Version of the Bible. This is the text of the Bible presented in a new reading environment that I created along with my daughter Adina, a UX and graphics designer. It is implemented using Alpha Anywhere and delivered as an HTML5 app in a PhoneGap wrapper.

To read the whole story, and to see a video of the app in action, read "AlphaRef Reader: Tablet-first design of an app for reading reference material" in the Writings section of my web site.
Published: Tue, 20 May 2014 16:29:54 GMT
BCS Mac Introduction Meeting video is now available -- how that came about
The January 30, 1984, video of the introduction of the Macintosh to the public at a Boston Computer Society General Meeting is now available online. Harry McCracken, TIME's editor at large for personal technology, posted an exclusive look at the video early this morning, right after excerpts were shown at the Mac 30th event in California (an "official" event marking the 30th anniversary of the original Mac announcement, held at the same place where it occurred). The Computer History Museum will be posting material on its web site tomorrow, and will update the video when additional processing of it finishes.

See "Exclusive: Watch Steve Jobs’ First Demonstration of the Mac for the Public, Unseen Since 1984" on TIME.com.

Harry explains how there were other videos of prior deliveries of Steve Jobs' talk introducing the Mac, but that this one is special for several reasons, some of which I touched upon in my blog post Friday about it, such as the Q&A with a general tech audience. He also explains a bit about how the release of the video came about. Let me fill in a bit more.

These were not fully "lost" videos. I have had my copy of the January 1984 VHS tape for years and let others know I had it, showing excerpts sometimes when I gave talks. Given the use of Apple material, and the special nature of the video, I did not feel it was yet time to release it to the public, as I was slowly doing with other videos I have (not BCS ones). I was in contact with the videographer, Glenn Koenig, who shot most of the BCS meetings on Software Arts' behalf over the years. He let me know that he had masters of many of them, though he hadn't cataloged them (and didn't search out the Jobs one until Harry asked, which yielded better versions of parts of the presentation). I let him know of the material that I had made sure made it to the Computer History Museum. I encouraged him to continue to maintain his tapes in good condition, which is what he has been doing all these years.

We always intended to complete the edits of this material, but there is a lot of it, and digitizing it properly and editing it takes a lot of time and money. We didn't have the funds to do that and it was unclear if others would chip in and how to bring that about. Doing the work prematurely with the wrong equipment could compromise the tapes. (Of course, waiting too long could cause them to deteriorate too much.)

When Harry, a BCS member going back to the VisiCalc days, contacted BCS founder Jonathan Rotenberg in the fall of 2012 to ask about video from BCS meetings, Jonathan got him in contact with Glenn. Glenn, Jonathan, and I brainstormed about ways to finally make the restoration and release happen. I contacted Ray Ozzie, who I knew was interested in historic videos and had worked with us at Software Arts in the early 1980s. He, like Harry, suggested the Computer History Museum as a means for funding instead of something like Kickstarter. I have been a long-time supporter of the Museum, and have been inducted as one of their "fellows" for my work on VisiCalc, and knew this was the right way to go.

Eventually, Jonathan, Glenn, and I started working with people at the Museum, cataloging which material we had between us, and worked up a budget and a plan for fundraising. Glenn pointed out the upcoming 30th anniversary of the Mac as a good "news hook" and noted that a hoped-for article by TIME could help publicize the entire set (thank you, Harry, for coming through with a great one!). We met periodically on the phone (we all had other jobs to keep us very busy) and eventually crafted a "request" letter and a list of potential donors. Glenn produced a short trailer of excerpts from a few of the tapes, using some refurbished equipment, to show those people.

In September of last year we had the material ready for fundraising, and I hand-signed (and in some cases added a little note to) a few dozen letters and mailed them out. Within a little while, some money trickled in. Then Brad Feld, a friend from the early Trellix days now living in Colorado, contacted me for more information. Within a few days he and his wife Amy became the major funders, ensuring that the project was well off the ground. He also helped by reaching out to some of the people I had approached. We have since raised enough to do this project well, and hopefully even to do additional tapes that are surfacing, though the Museum could always use additional funds for such things.

The people producing the video for the Mac 30th event, most importantly Gabreal Franklin, helped Glenn work out details and ensured that there would be quality video for their event, both from the January 1984 video and some others.

For many reasons, including the difficulty in getting the slides shown the way we need them, the current version of the Mac Introduction is just a "rough cut". More editing will occur. In any case, Glenn put in many late nights to make this happen on time.

Now that everything is set up, we will be processing other videos, too, and releasing them in batches. I'll write more about them when they are available. To me, some of them are of the same historic value as this one, if not more.

The Mac Introduction video has served its purpose well. It has drawn attention to the Boston Computer Society and the other videos, and to the Museum. It has helped a new generation see what happened in the early days of the PC revolution, and an older generation remember their past, and helped chronicle the thinking of the time for posterity. It has also helped fill in the record of the life of a giant of our industry, capturing an interaction with the users of his works like no other.
Published: Sun, 26 Jan 2014 20:55:46 GMT
The Mac turns 30 and because we preserved the history you will be able to relive it
Today is 30 years since the Apple Macintosh was announced. That event occurred at a shareholders meeting in California. The following Monday, Steve Jobs and a crew of people from Apple traveled to Boston, Massachusetts, to redo that event for the public. It was at the Boston Computer Society General Meeting on January 30, 1984, at the John Hancock Hall in Boston.

In addition to Steve's speech and demonstration, he also brought many of the developers of the Mac software and hardware with him. After his presentation, they all sat on stage and answered questions from the audience and did further demos. For the over 1,000 people who attended, it was an amazing event. At the shareholders meeting there was the worry of how it would be received and if the demos would work. At the BCS meeting there was much less pressure and Steve was relaxed and confident and engaged the audience. The attendees were knowledgeable and savvy.

There is something else about the Boston event: My old company, Software Arts, at my suggestion as I recall, was paying to have video tapes made of many of the BCS General Meetings. (I was a board member of the BCS at various points). For this event, Apple provided additional cameras and made sure the lighting was good (it was not good in California, apparently).

We at Software Arts ended up with an edited VHS copy of the event that I have kept for these 30 years and sometimes show excerpts from when I talk to students to show them what it was like in the "old" days. I also would show it to my daughters to let them see how a real pro delivered a speech. At this point I know Steve's intonation on every syllable of the start of his presentation. (It's different than when he gave it in California.)

What happened last year is something wonderful: It turns out that many of the original BCS meeting tapes still exist. Some were donated to the Computer History Museum with my help. Many of the master tapes were still with the videographer, Glenn Koenig, including 3/4" tapes that are higher quality than the VHS copies. Together with Glenn and Jonathan Rotenberg (the founder and initial head of the BCS), we finally started a project with the Computer History Museum to restore the tapes and make the videos available to the public. This involves careful work on refurbished old equipment, and careful editing informed by memories of the events themselves, to get the best possible video and sound. It will also include transcripts and other related museum-type treatments.

We have raised money to pay for this (there are over 20 meetings to process) and have started digitizing and editing the tapes. The Mac Introduction will be the first, but there are others of great interest with the leaders of many leading personal computing companies, from Microsoft to Radio Shack to Digital Research to Lotus and IBM.

A thank you to Brad Feld, who remembers the impact the BCS had on his interest in both technology and entrepreneurship from his days at MIT, and all of the others who came through with the money we needed!

The Mac video will be released soon (I'll post here when it's available) and others will follow over time.

[Photo of Steve Jobs about to insert disk into Mac in January 1984, from my VHS copy appears here in the original blog post on Dan Bricklin's Log]

For those of you who have only seen the young Steve Jobs portrayed by actors, seeing him and his team as they actually were should be a real treat. I get chills down my spine watching it.

It is such a joy to realize that a decision we made over 30 years ago at my old company, when the personal computing industry was a young oddity, will bring those days to life for a new generation and for generations to come, and that they will care about and appreciate it.

(This isn't the first time I've felt that way. Another recording we did, an internal one of a staff meeting as the IBM PC was being announced, ended up a key scene in the PBS documentary "Triumph of the Nerds" -- see "IBM PC Announcement" on my web site's Writings section. I've also posted a few other old videos I have: See "Video of Fall Comdex 1983" and "Video of Bob Frankston's tour of the WWW in 1994".)

I recorded a 20-minute podcast interview with Jonathan yesterday to get the back story on how he, a 20-year-old at the time, got Steve Jobs to debut in Boston. You can listen to the MP3 recording of our Skype conversation, "Jonathan Rotenberg about the 1984 BCS Mac event 2014-01-22" on my podcast channel.

In those days, the top companies in the industry would come to the BCS and other user groups to demonstrate their new innovations, sometimes as their first public showing. (The BCS was the most influential and probably biggest such group.) Local and national TV didn't show up to document the events (though the Wall Street Journal and other press often did). However, "personal" video taping helped preserve some of it. I know that other user groups did taping (I have some tape of me from at least one of those venues) and I hope that other material has been preserved and will be made available to the public and to historians through the Museum and other means.
Published: Fri, 24 Jan 2014 19:44:48 GMT
MassTLC unConference session on Responsive App Design
Just as it has for each of the past three years, this fall the Massachusetts Technology Leadership Council ran its "Innovation unConference". It has a web site, www.masstlcuncon.org, that explains it all, and I posted lots of photos to Flickr as I have in the past.

This time, I proposed and moderated a session on Responsive App Design. I started things off by showing my video on the topic to get everybody up to speed with what I meant by the term. (On this blog I had called the topic "Responsive App Layout", but I've since switched to the more common word "Design", use the URL www.responsiveappdesign.org for the companion web site, and redid the video to better show the example app.)

In the session, we had a lively, useful discussion, which one of the attendees kindly captured on a flip chart that the conference had provided in the room. There were things worth sharing, so I'm reproducing what was written with a little embellishment:

- Contexts / Break Point (what to take into account for different designs)
-- Width / Columns / Landscape / Portrait
-- Sighted / Blind / Color blind
-- Mouse / Touch
-- User standing / sitting / walking / computer mounted on a wall
-- Language orientation (e.g., left to right vs. right to left)
-- One-handed use vs. two-handed use
-- Google Glass, Voice controls, Wrist devices
-- The user's state of mind: Calm, stressed, etc.
-- Location: Inside, outside, driving, moving

- Examples of apps to learn from
-- Dark design
-- Inpath / Pathapp (for innovative controls)
-- Prezi
-- WGBH
-- Grid on iOS
-- Runkeeper
-- Messaging apps (Apple: context changing UX)
-- Tandem Seven (design firm HQ in Massachusetts)
-- UXArchive.com: Has lots of examples of UX of different apps organized by tasks (screenshots)

- Apps vs. Sites:
-- Sites have a Presentation Layer
-- Apps also have major Control Layers and Input Layers

- Don't forget different expectations in each OS for user interface style
-- Android vs. iOS vs. ...?

- "Write Once, QA Many"

- UX control techniques
-- Ease of Discoverability vs. Learned / Taught

- Action Items
-- Design Guidelines (we're too soon for standards)
-- Need more good/successful examples of business apps
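As a small illustration of the first group of notes, choosing among design variants can be treated as a function of the measured context; the break-point values and variant names in this Python sketch are invented for illustration:

```python
def choose_layout(width_px, touch, landscape):
    """Map a device context to a layout variant (hypothetical break points)."""
    if width_px < 600:
        layout = "one-column"
    elif width_px < 1024:
        # Mid-size screens get a second column only in landscape orientation.
        layout = "two-column" if landscape else "one-column"
    else:
        layout = "three-column"
    # Touch targets need more room than mouse targets.
    control_size = "large" if touch else "compact"
    return layout, control_size
```

A real app would fold in more of the contexts listed above (handedness, language direction, the user's situation), but the principle is the same: make the design a deliberate function of context rather than a single fixed layout.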

That's it. One of the attendees told us about how his company had worked quite hard on each variation of their native apps for each platform, and the thinking that went into each input control and interaction. They are in an app category (messaging) that is quite competitive, where specifics of the UI are among the distinguishing features of the product and what makes it worthwhile for users to choose their product instead of just using the built-in app. I hope I can get a podcast with him at some point to hear what they went through.
Published: Fri, 08 Nov 2013 20:04:01 GMT
Updated: Sun, 28 Jan 2018 20:44:51 GMT
The URL to provide to an RSS aggregator when subscribing to this feed: http://danbricklin.com/log_rss.xml
(For more information about RSS see: What is RSS?.)