Month: June 2019

Tiny Robobee X-Wing powers its flight with light

We’ve seen Harvard’s Robobee flying robot evolve for years: After first learning to fly, it learned to swim in 2015, then to jump out of the water again in 2017 — and now it has another trick up its non-existent sleeve. The Robobee X-Wing can fly using only the power it collects from light hitting its solar cells, making it possible to stay in the air indefinitely.

Achieving flight at this scale is extremely hard. You might think that a small craft, like an insect, would find it easy to take off and stay aloft. But self-powered flight actually gets much harder the smaller you go, which puts insects among the most bafflingly marvelous feats of engineering we have encountered in nature.

Oh, it’s easy enough to fly when you have a wire feeding you electricity to power a pair of tiny wings — and that’s how the Robobee and others flew before. It’s only very recently that researchers have accomplished meaningful flight using on-board power or, in one case, a laser zapping an attached solar panel.

The new Robobee X-Wing (named for its four-wing architecture) achieves a new milestone with the ability to fly with no battery and no laser — only plain full-spectrum light coming from above. Brighter than sunlight, to be fair — but close to real-world conditions.

The team at Harvard’s Microrobotics Laboratory accomplished this by making the power conversion and wing mechanical systems incredibly lightweight — the whole thing weighs about a quarter of a gram, or about half a paper clip. Its power consumption is likewise lilliputian:

Consuming only 110–120 milliwatts of power, the system matches the thrust efficiency of similarly sized insects such as bees. This insect-scale aerial vehicle is the lightest thus far to achieve sustained untethered flight (as opposed to impulsive jumping or liftoff).

That last bit is some shade thrown at its competitors, which by nature can’t quite achieve “sustained untethered flight,” though what constitutes that isn’t exactly clear. After all, this Dutch flapping flyer can go a kilometer on battery power. If that isn’t sustained, I don’t know what is.

In the video of the Robobee you can see that when it is activated, it shoots up like a bottle rocket. One thing they don’t really have space for on the robot’s little body (yet) is sophisticated flight control electronics and power storage that could let it use only the energy it needs, flapping in place.

That’s probably the next step for the team, and it’s a non-trivial one: adding weight and new systems completely changes the device’s flight profile. But give them a few months or a year and this thing will be hovering like a real dragonfly.

The Robobee X-Wing is exhaustively described in a paper published in the journal Nature.

Police body-cam maker Axon says no to facial recognition, for now

Facial recognition is a controversial enough topic without bringing in everyday policing and the body cameras many (but not enough) officers wear these days. But Axon, which makes many of those cameras, solicited advice on the topic from an independent research board, and in accordance with its findings has opted not to use facial recognition for the time being.

The company, formerly known as Taser, established its “AI and Policing Technology Ethics Board” last year, and the group of 11 experts from a variety of fields just issued their first report, largely focused (by their own initiative) on the threat of facial recognition.

The advice they give is unequivocal: don’t use it — now or perhaps ever.

More specifically, their findings are as follows:

  • Facial recognition simply isn’t good enough right now for it to be used ethically.
  • Don’t talk about “accuracy,” talk about specific false negatives and positives, since those are more revealing and relevant.
  • Any facial recognition model that is used shouldn’t be overly customizable, or it will open up the possibility of abuse.
  • Any application of facial recognition should only be initiated with the consent and input of those it will affect.
  • Until there is strong evidence that these programs provide real benefits, there should be no discussion of use.
  • Facial recognition technologies do not exist, nor will they be used, in a political or ethical vacuum, so consider the real world when developing or deploying them.

The full report may be read here; there’s quite a bit of housekeeping and internal business, but the relevant part starts on page 24. Each of the above bullet points gets a couple pages of explanation and examples.

Axon, for its part, writes that it is quite in agreement: “The first board report provides us with thoughtful and actionable recommendations regarding face recognition technology that we, as a company, agree with… Consistent with the board’s recommendation, Axon will not be commercializing face matching products on our body cameras at this time.”

Not that they won’t be looking into it. The idea, I suppose, is that the technology will never be good enough to provide the desired benefits if no one advances the science that underpins it. The report doesn’t object to this, advising only that the company adhere to the evolving best practices of the AI research community to make sure its work is free from biases and systematic flaws.

One interesting point that isn’t always brought up is the difference between face recognition and face matching. The former is the colloquial catch-all term for what we think of as potentially invasive, biased, and so on, but in the terminology used here the two are distinct.

Face recognition, or detection, is just finding the features that make up a face in a picture — a smartphone can use this to focus its camera or apply an effect, for instance. Face matching takes the features of the detected face and compares them to a database in order to match them to one on file — that could be to unlock your phone using Face ID, but it could also be the FBI comparing everyone entering an airport to the most wanted list.
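
To make the distinction concrete, here is a minimal matching sketch. The embeddings and threshold are made up for illustration, standing in for the output of a real face-encoding model; none of this reflects Axon's actual pipeline.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.8):
    """Face *matching*: compare a detected face's embedding against a
    database of enrolled embeddings; return the best match above the
    threshold, or None. Detection (finding the face in the image and
    producing `probe`) is a separate, upstream step."""
    best_id, best_score = None, threshold
    for person_id, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy embeddings; a real system would get these from a neural network
db = {"alice": [0.9, 0.1, 0.0], "bob": [0.0, 0.9, 0.4]}
probe = [0.85, 0.15, 0.05]  # embedding of a face detected in a new frame
print(match_face(probe, db))  # -> alice
```

Detection alone never touches the database; it is the comparison step that raises the policy questions the board is worried about.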

Axon uses face recognition and tracking to process the many, many hours of video that police departments full of body cams produce. When that video is needed as evidence, faces other than the people directly involved may need to be blurred out, and you can’t do that unless you know where the faces are. (Update: This paragraph originally stated that Axon was using a “lesser form of face matching,” which matches faces within videos but not with any central database, that it calls face re-identification. In fact this technology is not currently deployed commercially and is only in the research phase.)

That particular form of the technology seems benign in its current form, and no doubt there are plenty of other applications that it would be hard to disagree with. But as facial recognition techniques grow more mainstream it will be good to have advisory boards like this one keeping the companies that use them honest.

Startups at the speed of light: Lidar CEOs put their industry in perspective

As autonomous cars and robots loom over the landscapes of cities and jobs alike, the technologies that empower them are forming sub-industries of their own. One of those is lidar, which has become an indispensable tool to autonomy, spawning dozens of companies and attracting hundreds of millions in venture funding.

But like all industries built on fast-moving technologies, the lidar and sensing business rests somewhat on a foundation of shifting sands. New research appears weekly advancing the art, and new partnerships are minted just as often, as car manufacturers like Audi and BMW scramble to keep ahead of their peers in the emerging autonomy economy.

To compete in the lidar industry means not just to create and follow through on difficult research and engineering, but to be prepared to react with agility as the market shifts in response to trends, regulations, and disasters.

I talked with several CEOs and investors in the lidar space to find out how the industry is changing, how they plan to compete, and what the next few years have in store.

Their opinions and predictions sometimes synced up and at other times diverged completely. For some, the future lies manifestly in partnerships they have already established and hope to nurture, while others feel it’s too early for automakers to commit, and that they’re stringing startups along one non-exclusive contract at a time.

All agreed that the technology itself is obviously important, but not so important that investors will wait forever for engineers to get it out of the lab.

And while some felt a sensor company has no business building a full-stack autonomy solution, others suggested that’s the only way to attract customers navigating a strange new market.

It’s a flourishing market but one, they all agreed, that will experience a major consolidation in the next year. In short, it’s a wild west of ideas, plentiful money, and a bright future — for some.

The evolution of lidar

I’ve previously written an introduction to lidar, but in short, lidar units project lasers out into the world and measure how they are reflected, producing a 3D picture of the environment around them.
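
The core measurement behind those 3D pictures is simple time-of-flight arithmetic: a pulse travels out and back at the speed of light, so halving the round trip gives the range. A minimal sketch (the 200-nanosecond figure is just an illustrative example):

```python
# Time-of-flight ranging: a lidar fires a laser pulse and times the echo.
# distance = (speed of light * elapsed time) / 2, since the pulse goes
# out to the target and back.
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_pulse(round_trip_seconds):
    return C * round_trip_seconds / 2

# A return after ~200 nanoseconds puts the target about 30 meters away
print(round(distance_from_pulse(200e-9), 1))  # -> 30.0
```

Sweeping that measurement across thousands of directions per second is what turns single ranges into the point clouds autonomous cars rely on.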

NASA’s Dragonfly will fly across the surface of Titan, Saturn’s ocean moon

NASA has just announced its next big interplanetary mission: Dragonfly, which will deliver a Mars Rover-sized flying vehicle to the surface of Titan, a moon of Saturn with tantalizing life-supporting qualities. The craft will fly from place to place, sampling the delicious organic surface materials and sending high-resolution pictures back to Earth.

Dragonfly will launch in 2026, taking eight years to reach Titan and land (if all goes well) in 2034. So there will be plenty more updates after this one!

The craft will parachute through Titan’s hazy atmosphere and land among its dune-filled equatorial region. It’s equipped with drills and probes to investigate the surface, and of course cameras to capture interesting features and the surrounding alien landscape, flying from place to place using a set of rotors like a drone’s.

We’ve observed Titan from above via the Cassini mission, and we’ve even touched down on its surface briefly with the Huygens probe — which for all we know is still sitting there. But this will be a much more in-depth look at this fascinating moon.

Titan is a weird place. With rivers, oceans, and abundant organic materials on the surface, it’s very like Earth in some ways — but you wouldn’t want to live there. The rivers are liquid methane, for one thing, and if you’re familiar with methane, you’ll know that means it’s really cold there.

Nevertheless, Titan is still an interesting analogue to early Earth.

“We know that Titan has rich organic material, very complex organic material on the surface; there’s energy in the form of sunlight; and we know there’s been water on the surface in the past. These ingredients, which we know are necessary for the development of life as we know it, are sitting on the surface of Titan,” said principal investigator Elizabeth Turtle. “They’ve been doing chemistry experiments, basically, for hundreds of millions of years, and Dragonfly is designed to go pick up the results of those experiments.”

Don’t expect a flourishing race of methane-dwelling microbes, though. It’s more like going back in time to pre-life Earth to see what conditions may have resulted in the earliest complex self-replicating molecules: the origin of the origin of life, if you will.

Principal investigator Elizabeth Turtle shows off a 1/4 scale model of the Dragonfly craft.

To do so, Dragonfly, true to its name, will flit around the surface to collect data from many different locations. It may seem that something the size of a couch would have trouble lifting off, but as Turtle explained, it’s actually a lot easier to fly around Titan than to roll. With a far thicker atmosphere (mostly nitrogen, like ours) and a fraction of Earth’s gravity, it’ll be more like traveling through water than air.

That explains why its rotors are so small — for something that big on Earth, you’d need huge powerful rotors working full time. But even one of these little rotors can shift the craft if necessary (though they’ll want all eight for lift and redundancy).

We’ll learn more soon, no doubt. This is just the opening salvo from NASA on what will surely be years of further highlights, explanations, and updates on Dragonfly’s creation and launch.

“It’s remarkable to think of this rotorcraft flying miles and miles across the organic sand dunes of Saturn’s largest moon, exploring the processes that shape this extraordinary environment,” said NASA associate administrator for science Thomas Zurbuchen. “Titan is unlike any other place in the solar system, and Dragonfly is like no other mission.”

Apple’s Sidecar just really *gets* me, you know?

With the rollout of Apple’s public beta software previews of macOS and the new iPadOS, I’ve finally been able to experience first-hand Sidecar, the feature that lets you use an iPad as an external display for your Mac. This is something I’ve been looking to make work since the day the iPad was released, and it’s finally here – and it’s just about everything you could ask for.

These are beta software products, and I’ve definitely encountered a few bugs, including my main Mac display blanking out and requiring a restart (that’s totally fine – betas by definition aren’t fully baked). But Sidecar is already a game-changer, and one that I will probably have a hard time living without in the future – especially on the road.

Falling nicely into the ‘it just works’ Apple ethos, setting up Sidecar is incredibly simple. As long as your Mac is running macOS 10.15 Catalina, and your iPad is nearby, with Bluetooth and Wi-Fi enabled, and running the iPadOS 13 beta, you just click on the AirPlay icon in your Mac’s menu bar and it should show up as a display option.

Once you select your iPad, Sidecar quickly displays an extended desktop from your Mac on the iOS device. It’s treated as a true external display in macOS System Preferences, so you can arrange it with other displays, mirror your Mac and more. The one thing you can’t do that you can with traditional displays is change the resolution – Apple keeps it fixed at 1366 x 1024, but that’s effectively your iPad’s native resolution (2732 x 2048, accounting for Retina pixel doubling on the first-generation 12.9-inch iPad Pro I’m using for testing), and it means there’s nothing weird going on with pixelated graphics or funky text.

Apple also turns on, by default, both a virtual Touch Bar and a new feature called ‘Sidebar’ (yes, a Sidebar for your Sidecar) that provides a number of useful commands, including the ability to call up the dock, summon a virtual keyboard, quickly access the command key and more. This is particularly useful if you’re using the iPad on its own, away from the attached Mac: when you’re deep in a drawing application and just looking to do quick things like undo, Apple has a dedicated button in Sidebar for that, too.

The Touch Bar is identical to Apple’s hardware Touch Bar, which it has included on MacBook Pros since its introduction in 2016. The Touch Bar has always been kind of a ‘meh’ feature, and some critics vocally prefer the entry-level 13-inch MacBook Pro model that does away with it altogether in favor of an actual hardware Escape key. And on the iPad using Sidecar, you also don’t get what might be its best feature – Touch ID. But if you’re using Sidecar specifically for photo or video editing, it’s amazing to have it called up and sitting there ready to go, as an app-specific dedicated quick-action toolbar.

Best of all, Apple made it possible to easily turn off both these features, and to do so quickly right from your Mac’s menu bar. That way, you get the full benefit of your big beautiful iPad display. Sidecar will remember this preference too for next time you connect.

Also new to macOS Catalina is a hover-over menu for the default window controls (those three ‘stoplight’ circular buttons that appear at the top left of any Mac app). Apple now provides options to either go fullscreen, tile your app left or right to take up 50% of your display, or, if you’re using Sidecar, to quickly move the app to Sidecar display or back.

This quick shuffle action works great, and also respects your existing windows settings, so you can move an app window that you’ve resized manually to take up a quarter of your Mac’s display, and then when you send it back from the Sidecar iPad, it’ll return to where you had it originally in the same size and position. It’s definitely a nice step up in terms of native support for managing windows across multiple displays.

I’ve been using Sidecar wirelessly, though it also works wired and Apple has said there shouldn’t really be any performance disparity regardless of which way you go. So far, the wireless mode has exceeded all expectations, and any third-party competitors in terms of reliability and quality. It also works with the iPad Pro keyboard case, which makes for a fantastic input alternative if you happen to be closer to that one instead of the keyboard you’re using with your Mac.

Sidecar also really shines for digital artists, because it supports Apple Pencil input immediately in apps that have already built in support for stylus input on Macs, including Adobe Photoshop and Affinity Photo. I’ve previously used a Wacom Cintiq 13HD with my Mac for this kind of thing, and I found Apple’s Sidecar to be an amazing alternative, not least because it’s wireless, and even the 12.9-inch iPad Pro is so much more portable than the Wacom device. Input has very little lag (it’s not even really perceptible), there’s no calibration required to make sure the Pencil lines up with the cursor on the screen, and as I mentioned above, combined with the Sidebar and its dedicated ‘Undo’ button, it’s an artistic productivity machine.

The Pencil is the only means of touch input available with Sidecar, and that’s potentially going to be weird for users of other third-party display extender apps, most of which support full touch input for the extended Mac display they provide. Apple has intentionally left out finger-based touch input, because Mac just wasn’t designed for it, and in use that actually tracks with what my brain expects, so it probably won’t be too disorienting for most users.

When Apple introduced the 5K iMac, it left out one thing that had long been a mainstay of that all-in-one desktop – Target Display Mode. It was a sad day for people who like to maximize the life of their older devices. But Apple has more than made up for it with the introduction of Sidecar, which genuinely doubles the utility of any modern iPad, provided you’re someone for whom additional screen real estate, with or without pressure-sensitive pen input, is valuable. As someone who often works on the road and out of the office, Sidecar seems like something I personally designed in the room with Apple’s engineering team.

Crowdfunded spacecraft LightSail 2 prepares to go sailing on sunlight

Among the many spacecraft and satellites ascending to space on Monday’s Falcon Heavy launch, the Planetary Society’s LightSail 2 may be the most interesting. If all goes well, a week from launch it will be moving through space — slowly, but surely — on nothing more than the force exerted on it by sunlight.

LightSail 2 doesn’t have solar-powered engines, or use solar energy or heat for some secondary purpose; it will literally be propelled by the physical force of photons hitting its immense shiny sail. Not solar wind, mind you — that’s a different thing altogether.

It’s an idea, Planetary Society CEO and acknowledged Science Guy Bill Nye explained in a press call ahead of the launch, that goes back centuries.

“It really goes back to the 1600s,” he said; Kepler deduced that a force from the sun must cause comet tails and other effects, and “he speculated that brave people would one day sail the void.”

So they might, as more recent astronomers and engineers have pondered the possibility more seriously.

“I was introduced to this in the 1970s, in the disco era. I was in Carl Sagan’s astronomy class… wow, 42 years ago, and he talked about solar sailing,” Nye recalled. “I joined the Planetary Society when it was formed in 1980, and we’ve been talking about solar sails around here ever since then. It’s really a romantic notion that has tremendous practical applications; there are just a few missions that solar sails are absolutely ideal for.”

Those would primarily be long-term, medium-orbit missions where a craft needs to stay in an Earth-like orbit, but still get a little distance away from the home planet — or, in the future, long-distance missions where slow and steady acceleration from the sun or a laser would be more practical than another propulsion method.

Mission profile

The eagle-eyed among you may have spotted the “2” in the name of the mission. LightSail 2 is indeed the second of its type; the first launched in 2015, but was not planned to be anything more than a test deployment that would burn up after a week or so.

That mission had some hiccups, with the sail not deploying to its full extent and a computer glitch compromising communications with the craft. It was not meant to fly via solar sailing, and did not.

“We sent the CubeSat up, we checked out the radio, the communications, the overall electronics, and we deployed the sail and we got a picture of that deployed sail in space,” said COO Jennifer Vaughn. “That was purely a deployment test; no solar sailing took place.”

The spacecraft itself, minus the sail, of course.

But it paved the way for its successor, which will attempt this fantastical form of transportation. Other craft have done so, most notably JAXA’s IKAROS mission to Venus, which was quite a bit larger — though as LightSail 2’s creators pointed out, not nearly as efficient as their craft — and had a very different mission.

The brand new spacecraft, loaded into a 3U CubeSat enclosure — that’s about the size of a loaf of bread — is piggybacking on an Air Force payload going up to an altitude of about 720 kilometers. There it will detach and float freely for a week to get away from the rest of the payloads being released.

Once it’s safely on its own, it will fire out from its carrier craft and begin to unfurl the sail. From that loaf-sized package will emerge an expanse of reflective Mylar with an area of 32 square meters — about the size of a boxing ring.

Inside the spacecraft’s body is also what’s called a reaction wheel, which can be spun up or slowed down in order to impart the opposite force on the craft, causing it to change its attitude in space. By this method LightSail 2 will continually orient itself so that the photons striking it propel it in the desired direction, nudging it into the desired orbit.
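
The wheel works by conservation of angular momentum: spin the wheel one way and the craft turns the other. A toy sketch of that relationship, with made-up moments of inertia rather than LightSail 2's actual values:

```python
# Reaction-wheel attitude control in one line of physics: total angular
# momentum is conserved, so I_wheel * dw_wheel + I_craft * dw_craft = 0.
# Both inertia values below are hypothetical, for illustration only.
I_WHEEL = 1e-4   # wheel moment of inertia, kg*m^2 (made up)
I_CRAFT = 0.05   # spacecraft moment of inertia, kg*m^2 (made up)

def craft_spin_change(wheel_spin_change_rad_s):
    """Change in the craft's spin rate caused by changing the wheel's."""
    return -I_WHEEL * wheel_spin_change_rad_s / I_CRAFT

# Spinning the wheel up by 100 rad/s turns the craft the opposite way
print(craft_spin_change(100.0))  # -> -0.2 (rad/s)
```

Because the ratio of inertias is small, big changes in wheel speed produce gentle, controllable turns of the spacecraft, which is exactly what you want for repeatedly re-aiming a sail.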

1 HP (housefly power) engine

The thrust produced, the team explained, is very small — as you might expect. Photons have no mass, but they do (somehow) have momentum. Not a lot, to be sure, but it’s greater than zero, and that’s what counts.

“In terms of the amount of force that solar pressure is going to exert on us, it’s on the micronewton level,” said LightSail project manager Dave Spencer. “It’s very tiny compared to chemical propulsion, very small even compared to electric propulsion. But the key for solar sailing is that it’s always there.”

“I have many numbers that I love,” cut in Nye, and detailed one of them: “It’s nine micronewtons per square meter. So if you have 32 square meters you get about a hundred micronewtons. It doesn’t sound like much, but as Dave points out, it’s continuous. Once a rocket engine stops, when it runs out of fuel, it’s done. But a solar sail gets a continuous push day and night. Wait…” (He then argued with himself about whether it would experience night — it will, as you see in the image below.)

Bruce Betts, chief scientist for LightSail, chimed in as well, to make the numbers a bit more relatable: “The total force on the sail is approximately equal to the weight of a house fly on your hand on Earth.”

Yet if you added another fly every second for hours at a time, pretty soon you’ve got a really considerable amount of acceleration going on. This mission is meant to find out whether we can capture that force.
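
To put the fly analogy in numbers: taking the roughly hundred micronewtons of total thrust quoted above, and assuming a spacecraft mass of about 5 kg (my assumption; the article doesn't give one), the continuous push adds up day after day:

```python
# Back-of-the-envelope solar sailing. The thrust figure comes from the
# quotes above; the spacecraft mass is an assumption, not a quoted spec.
FORCE = 100e-6   # N, total photon thrust on the sail (quoted figure)
MASS = 5.0       # kg, assumed spacecraft mass

accel = FORCE / MASS              # m/s^2, from F = m * a
delta_v_per_day = accel * 86_400  # m/s gained per day of continuous push

print(f"{delta_v_per_day:.2f} m/s per day")  # -> 1.73 m/s per day
```

Under a meter and a half per second per day sounds trivial, but unlike a rocket engine it never shuts off, which is the whole argument for solar sailing.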

“We’re very excited about this launch,” said Nye, “because we’re going to get to a high enough altitude to get away from the atmosphere, far enough that we’re really gonna be able to build orbital energy and take some, I hope, inspiring pictures.”

Second craft, same (mostly) as the last

The LightSail going up this week has some improvements over the last one, though overall it’s largely the same — and a relatively simple, inexpensive craft at that, the team noted. Crowdfunding and donations over the last decade have provided quite a bit of cash to pursue this project, but it still is only a small fraction of what NASA might have spent on a similar mission, Spencer pointed out.

“This mission is going to be much more robust than the previous LightSail 1, but as we said previously, it’s done by a small team,” he said. “We’ve had a very small budget relative to our NASA counterparts, probably 1/20th of the budget that a similar NASA mission would have. It’s a low-cost spacecraft.”

Annotated image of LightSail 2, courtesy of Planetary Society.

But the improvements are specifically meant to address the main problems encountered by LightSail 2’s predecessor.

Firstly, the computer inside has been upgraded to be more robust (though not radiation-hardened) and given the ability to sense faults and reboot if necessary — they won’t have to wait, as they did for LightSail 1, for a random cosmic ray to strike the computer and cause a “natural reboot.” (Yes, really.)

The deployment of the sail itself has also been improved. The previous one only extended to about 90% of its full width and couldn’t be adjusted after the fact. Tests have since been done, Betts told me, to determine exactly how many revolutions the motor must make to extend the sail to 100%. Not only that, but they have put markings on the extending booms that will help double-check how deployment has gone.

“We also have the capability on orbit, if it looks like it’s not fully extended, we can extend it a little bit more,” he said.

Once it’s all out there, it’s uncharted territory. No one has attempted quite this kind of mission; even IKAROS had a totally different flight profile. The team is hoping their sensors and software are up to the task — and it should be clear whether that’s the case within a few hours of unfurling the sail.

It’s still mainly an experiment, of course, and what the team learns from this they will put into any future LightSail mission they attempt, and also share with the spaceflight community and others attempting to sail on sunlight.

“We all know each other and we all share information,” said Nye. “And it really is — I’ve said it as much as I can — it’s really exciting to be flying this thing at last. It’s almost 2020 and we’ve been talking about it for, well, for 40 years. It’s very, very cool.”

LightSail 2 will launch aboard a SpaceX Falcon Heavy no sooner than June 24th. Keep an eye on the site for the latest news and a link to the live stream when it’s almost time for takeoff.

Roli’s newest instrument, the Lumi, helps you learn to play piano with lights

There has been a longstanding gulf between the consumption of music and the creation of it: not everyone has the time or money to spend on lessons and instruments, and for those in school, many music education programs have been cut back over the years, making the option of learning to play instruments for free less common. Still others have had moments of interest but haven’t found the process of learning that easy.

Now we’re seeing a new wave of startups emerge that are attempting to tackle these issues with technology, creating tools and even new instruments that leverage smartphones and tablets, new hardware computing innovations and new software to make learning music more than just a pastime for a select few.

In the latest development, London startup Roli is launching a new interactive keyboard called the Lumi. Part sound-sensitive lightboard and part piano, the Lumi’s keys light up in a colorful array to help guide and teach you to play music. The 11-inch keyboard — which can be linked with one or two more of the same to add more octaves — comes with an iPad app containing hundreds of pieces, and the two are now selling for $249 alongside a new Kickstarter to help drum up interest and offer early-bird discounts. The Kickstarter campaign blew through its modest £100,000 goal within a short while, and some of the smaller tiers of pledges are now sold out. The product will start shipping in October 2019, the company says.

As you might already know, or have guessed from the reaction to the Kickstarter, this is not Roli’s first rodeo: the company has made two other major products (and variations on those two) before this, also aimed at music making. First came the Seaboard, which Roli described as a new instrument when it first launched. Taking the form factor of a keyboard, it contained squishy keys that let the player bend notes and create other effects, alongside the percussive tapping you would do with a normal keyboard.

Its next product was Blocks: small, modular light boards that also used colored light to guide your playing and help you create new and interesting sounds and beats with taps (and using a similarly squidgy surface to the Seaboard) and then mix them together.

Both of these were interesting, but somewhat aimed at those who were already familiar with playing pianos or other instruments, or with creating and playing electronic music with synthesizers, FX processors and mixers. (Case in point: the people I know who were most interested in these were my DJ friends and my kids, who both play the piano and are a little nerdy about these things.)

The Lumi is in a way a step back for Roli from trying to break new ground by conceiving of completely new instruments, with new form factors built with the benefits of technology and electronics in mind. But it’s also a step ahead: using a keyboard as the basis of the instrument, the Lumi is more familiar and therefore more accessible — with an accessible price of $249 to go along with that.

Lumi’s emergence comes after an interesting few years of growth for Roli. The company is one of the select few (and I think the only one making musical instruments) to be retailed in Apple stores, and it’s had endorsements from some very high-profile people, but that’s about as mainstream as it has been up to now.

The startup’s founder and CEO, American-born Roland Lamb, is probably best described as a polymath, someone who comes across less as a geeky and nervous or (at the other end) ultra-smooth-talking startup founder, and more like a calm-voiced thinker who has come out to talk to you in a break between reading and writing about the nature of music and teaching a small philosophy seminar.

His background also speaks to this unconventional manner. Before coming to found Roli, he lived in a Zen monastery, made his way around the world playing jazz piano, and studied Chinese and Sanskrit at Harvard and design at the Royal College of Art.

Roli has always been a little cagey about how much it has raised and from whom, but the list includes consumer electronics giants like Sony, specialist audio makers like Onkyo, the music giant Universal Music Group and VCs that include Founders Fund, Index and LocalGlobe, Kreos Capital, Horizons Ventures and more. It’s also partnered with a number of big names like Pharrell Williams (who is also an investor) in the effort to get its name out.

And while it has most definitely made a mark with a certain echelon of the music world — producers and those creating electronic music — it has not parlayed that into a wider global reputation or wider accessibility. After bringing out instruments more for a high-end audience, the Lumi seems like an attempt to do just that.

That seems to be coming at the right time. Services like Spotify and YouTube — and the rise of phones and internet usage in general — have transformed how we listen to music. We now have a much wider array of things to listen to whenever we want. On top of that, services like YouTube and SoundCloud are giving us a taste of creating our own music: using electronic devices, we can go beyond what might have been limitations up to now (for example, never having learned to play an instrument in the traditional sense) to get stuck into the craft itself.

The Lumi is also tapping into another important theme: the idea that music is “good for you.” There is a line of thought that says learning an instrument is good for your mind, whether you’re a younger person still in school or an adult looking to stay sharp. Others believe it has health benefits.

But realistically, these beliefs don’t get applied very often. Roli cites stats that say that only 10% of adults aged 18-29 have played an instrument in the past year, and of those that played as children, some 80% say they quit by age 14.

Putting this together with the Lumi, it seems that the aim is to hit a wider swathe of the market and bring in people who might want to learn something like playing an instrument but previously thought it would be too much of a challenge.

Roli isn’t the first — nor likely the last — company to reconsider how to learn the piano through technology. The Chinese company ONE Music Group makes both smart pianos with keyboards that light up and a strip you can overlay on any keyboard, each paired with an iPad app for learning to play.

An American startup called McCarthy Music also makes illuminated-key pianos, also subscribing to the principle that providing this kind of guidance to teach muscle memory is an important step in getting a student acquainted with playing on a keyboard.

The Lumi is notable not just for its cost but for its size — the single, lightweight keyboards have a battery life of six hours and can fit in a backpack.

That said, Roli is hoping there will be a double audience for these in the longer term, bridging the divide between music maker and listener, and between amateur and pro.

“Many people would love to play an instrument but worry that they don’t have the talent. Through our research, design, and innovation at ROLI, we’ve come to believe that the problem is not a lack of talent. Rather, instruments themselves are not smart enough,” said Lamb in a statement. “What excites me most is that the intelligence of LUMI means that there’s something in it for everyone. On one hand my own kids now prefer LUMI time to movie time. On the other hand, several of the world’s leading keyboard players can’t wait to use LUMI in the studio and on the stage.”