Samsung’s new phone topped 1Gbps download speeds during this week’s tests, but the network still has a long way to go.
The weekend is the best time for relaxing activities, including binge-watching some Amazon Prime flicks. We’ve rounded up the best movies, TV shows, and documentaries that you can stream on Amazon Prime from May 17 to May […]
When your game tops 100 million players, your thoughts naturally turn to doubling that number. That’s the case with the creators, or rather stewards, of Minecraft at Microsoft, where the game has become a product category unto itself. And now it is making its biggest leap yet — to a real-world augmented reality game in the vein of Pokémon GO, called Minecraft Earth.
Announced today but not playable until summer (on iOS and Android) or later, MCE (as I’ll call it) is full-on Minecraft, reimagined to be mobile and AR-first. So what is it? As executive producer Jesse Merriam put it succinctly: “Everywhere you go, you see Minecraft. And everywhere you go, you can play Minecraft.”
Yes, yes — but what is it? Less succinctly put, MCE is like other real-world-based AR games in that it lets you travel around a virtual version of your area, collecting items and participating in mini-games. Where it’s unlike other such games is that it’s built on top of Minecraft: Bedrock Edition, meaning it’s not some offshoot or mobile cash-in; this is straight-up Minecraft, with all the blocks, monsters and redstone switches you desire, but in AR format. You collect stuff so you can build with it and share your tiny, blocky worlds with friends.
That introduces some fun opportunities and a few non-trivial limitations. Let’s run down what MCE looks like — verbally, at least, as Microsoft is being exceedingly stingy with real in-game assets.
Because it’s Minecraft Earth, you’ll inhabit a special Minecraftified version of the real world, just as Pokémon GO and Harry Potter: Wizards Unite put a layer atop existing streets and landmarks.
The look is blocky to be sure, but not so far off the normal look that you won’t recognize it. It uses OpenStreetMap data, including annotated and inferred information about districts, private property, safe and unsafe places and so on — which will be important later.
The fantasy map is filled with things to tap on, unsurprisingly called tappables. These can be a number of things: resources in the form of treasure chests, mobs and adventures.
Chests are filled with blocks, naturally, adding to your reserves of cobblestone, brick and so on, all the different varieties appearing with appropriate rarity.
Mobs are animals like those you might normally run across in the Minecraft wilderness: pigs, chickens, squid and so on. You snag them like items, and they too have rarities, and not just cosmetic ones. The team highlighted a couple of favorites: the muddy pig, which when placed down will stop at nothing to get to mud and never wants to leave, and the cave chicken, which lays mushrooms instead of eggs. Yes, you can breed them.
Last are adventures, which are tiny AR instances that let you collect a resource, fight some monsters and so on. For example you might find a crack in the ground that, when mined, vomits forth a volume of lava you’ll have to get away from, and then inside the resulting cave are some skeletons guarding a treasure chest. The team said they’re designing a huge number of these encounters.
Importantly, all these things — chests, mobs and encounters — are shared between friends. If I see a chest, you see a chest — and the chest will have the same items. And in an AR encounter, all nearby players are brought in, and can contribute and collect the reward in shared fashion.
And it’s in these AR experiences, and the “build plates” you’re collecting everything for, that the game really shines.
“If you want to play Minecraft Earth without AR, you have to turn it off,” said Torfi Olafsson, the game’s director. This is not AR-optional, as with Niantic’s games. This is AR-native, and for good and ill the only way you can really play is by using your phone as a window into another world. Fortunately it works really well.
First, though, let me explain the whole build plate thing. You may have been wondering how these collectibles and mini-games amount to Minecraft. They don’t — they’re just the raw materials for it.
Whenever you feel like it, you can bring out what the team calls a build plate, which is a special item, a flat square that you virtually put down somewhere in the real world — on a surface like the table or floor, for instance — and it transforms into a small, but totally functional, Minecraft world.
In this little world you can build whatever you want: dig into the ground, build an inverted palace for your cave chickens or create a paradise for your mud-loving pigs. Like Minecraft itself, each build plate is completely open-ended. Well, perhaps that’s the wrong phrase — they’re actually quite closely bounded, as the world only exists out to the edge of the plate. But they’re certainly yours to play with however you want.
Notably all the usual Minecraft rules are present — this isn’t Minecraft Lite, just a small game world. Water and lava flow how they should, blocks have all the qualities they should and mobs all act as they normally would.
The magic part comes when you find that you can instantly convert your build plate from miniature to life-size. Now the castle you’ve been building on the table is three stories tall in the park. Your pigs regard you silently as you walk through the halls and admire the care and attention to detail with which you no doubt assembled them. It really is a trip.
In the demo, I played with a few other members of the press; we got to experience a couple of build plates and adventures at life-size (technically actually 3/4 life size — the 1 block to 1 meter scale turned out to be a little daunting in testing). It was absolute chaos, really, everyone placing blocks and destroying them and flooding the area and putting down chickens. But it totally worked.
The system uses Microsoft’s new Azure Spatial Anchor system, which quickly and continuously fixed our locations in virtual space. It updated remarkably quickly, with no lag, showing the location and orientation of the other players in real time. Meanwhile the game world itself was rock-solid in space, smooth to enter and explore, and rarely bugging out (and that only in understandable circumstances). That’s great news considering how heavily the game leans on the multiplayer experience.
The team said they’d tested up to 10 players at once in an AR instance, and while there’s technically no limit, there’s sort of a physical limit in how many people can fit in the small space allocated to an adventure or around a tabletop. Don’t expect any giant 64-player raids, but do expect to take down hordes of spiders with three or four friends.
In choosing to make the game the way they’ve made it, the team naturally created certain limitations and risks. You wouldn’t want, for example, an adventure icon to pop up in the middle of the highway.
For exactly that reason the team put a lot of work into making the map metadata extremely robust. Adventures won’t spawn in areas like private residences or yards, though of course simple collectibles might. But because you’re able to reach things up to 70 meters away, it’s unlikely you’ll have to knock on someone’s door and say there’s a cave chicken in their pool and you’d like to touch it, please.
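For a sense of scale, that 70-meter reach amounts to a simple distance check against the player’s position. Here is a rough sketch in Python; the haversine formula itself is standard, but the function names and coordinates are purely illustrative and nothing here comes from Microsoft’s actual code:

```python
import math

REACH_M = 70  # Minecraft Earth reportedly lets you interact up to ~70 m away

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_reach(player, tappable, reach=REACH_M):
    """True if a tappable is close enough to the player to interact with."""
    return haversine_m(*player, *tappable) <= reach

# Two points about 56 m apart (illustrative coordinates):
print(in_reach((47.62, -122.35), (47.6205, -122.35)))  # True
```

The payoff of a generous radius like this is exactly what the article describes: most nearby collectibles become reachable from public ground, without the player having to stand on top of them.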
Adventures also won’t spawn in streets or hard-to-reach spots. The team said they worked hard to make it possible for the engine to recognize places that are not only publicly accessible, but safe and easy to access. Think sidewalks and parks.
Another limitation is that, as an AR game, you move around the real world. But in Minecraft, verticality is an important part of the gameplay. Unfortunately, the simple truth is that in the real world you can’t climb virtual stairs or descend into a virtual cave. You as a player exist on a 2D plane, and can interact with but not visit places above and below that plane. (An exception of course is on a build plate, where in miniature you can fly around it freely by moving your phone.)
That’s a shame for people who can’t move around easily, though you can pick up and rotate the build plate to access different sides. Weapons and tools also have infinite range, eliminating a potential barrier to fun and accessibility.
In Pokémon GO, there’s the drive to catch ’em all. In Wizards Unite, you’ll want to advance the story and your skills. What’s the draw with Minecraft Earth? Well, what’s the draw in Minecraft? You can build stuff. And now you can build stuff in AR on your phone.
The game isn’t narrative-driven, and although there is some (unspecified) character progression, for the most part the focus is on just having fun doing and making stuff in Minecraft. Like a set of LEGO blocks, a build plate and your persistent inventory simply make for a lively sandbox.
Admittedly that doesn’t sound like it carries the same addictive draw of Pokémon, but the truth is Minecraft kind of breaks the rules like that. Millions of people play this game all the time just to make stuff and show that stuff to other people. Although you’ll be limited in how you can share to start, there will surely be ways to explore popular builds in the future.
And how will it make money? The team basically punted on that question — they’re fortunately in a position where they don’t have to worry about that yet. Minecraft is one of the biggest games of all time and a big money-maker — it’s probably worth the cost just to keep people engaged with the world and community.
MCE seems to me like a delightful thing, but one that must be appreciated on its own merits. A lack of screenshots and gameplay video isn’t doing a lot to help you here, I admit. Trust me when I say it looks great, plays well and seems fundamentally like a good time for all ages.
Sound fun? Sign up for the beta here.
Children with vision impairments struggle to get a solid K-12 education for a lot of reasons — so the more tools their teachers have to impart basic skills and concepts, the better. ObjectiveEd is a startup that aims to empower teachers and kids with a suite of learning games accessible to all vision levels, along with tools to track and promote progress.
Some of the reasons why vision-impaired kids don’t get the education they deserve are obvious, for example that reading and writing are slower and more difficult for them than for sighted kids. But other reasons are less obvious, for example that teachers have limited time and resources to dedicate to these special needs students when their overcrowded classrooms are already demanding more than they can provide.
Technology isn’t the solution, but it has to be part of the solution, because technology is so empowering and kids take to it naturally. There’s no reason a blind 8-year-old can’t also be a digital native like her peers, and that presents an opportunity for teachers and parents both.
This opportunity is being pursued by Marty Schultz, who has spent the last few years as head of a company that makes games targeted at the visually impaired audience, and in the process saw the potential for adapting that work for more directly educational purposes.
Kids love games, and it’s hard to argue with that. True of many adults too, for that matter. As Schultz points out, this is something educators have realized in recent years and turned to everyone’s benefit.
“Almost all regular education teachers use educational digital games in their classrooms and about 20% use it every day,” he explained. “Most teachers report an increase in student engagement when using educational video games. Gamification works because students own their learning. They have the freedom to fail, and try again, until they succeed. By doing this, students discover intrinsic motivation and learn without realizing it.”
Having learned to type, point and click, do geometry and identify countries via games, I’m a product of this same process, and many of you likely are as well. It’s a great way for kids to teach themselves. But how many of those games would be playable by a kid with vision impairment or blindness? Practically none.
It turns out that these kids, like others with disabilities, are frequently left behind as the rising technology tide lifts everyone else’s boats. The fact is it’s difficult and time-consuming to create accessible games that target things like Braille literacy and blind navigation of rooms and streets, so developers haven’t been able to do so profitably and teachers are left to themselves to figure out how to jury-rig existing resources or, more likely, fall back on tried and true methods like printed worksheets, in-person instruction and spoken testing.
And because teacher time is limited and instructors trained in vision-impaired learning are thin on the ground, these outdated methods are also hard to tailor to an individual student’s needs. For example, a kid may be great at math but lack directionality skills. You need to draw up an “individual education plan” (IEP) explaining (among other things) this and what steps need to be taken to improve, then track those improvements. It’s time-consuming and hard! The idea behind ObjectiveEd is to create both games that teach these basic skills and a platform to track and document progress, as well as adjust the lessons to the individual.
How this might work can be seen in a game like Barnyard, which like all of ObjectiveEd’s games has been designed to be playable by blind, low-vision or fully sighted kids. The game has the student finding an animal in a big pen, then dragging it in a specified direction. The easiest levels might be left and right, then move on to cardinal directions, then up to clock directions or even degrees.
“If the IEP objective is ‘Child will understand left versus right and succeed at performing this task 90% of the time,’ the teacher will first introduce these concepts and work with the child during their weekly session,” Schultz said. That’s the kind of hands-on instruction they already get. “The child plays Barnyard in school and at home, swiping left and right, winning points and getting encouragement, all week long. The dashboard shows how much time each child is playing, how often, and their level of success.”
That’s great for documentation for the mandated IEP paperwork, and difficulty can be changed on the fly as well:
“The teacher can set the game to get harder or faster automatically, or move onto the next level of complexity automatically (such as never repeating the prompt when the child hesitates). Or the teacher can maintain the child at the current level and advance the child when she thinks it’s appropriate.”
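To make the “advance at 90%” idea concrete, here is a hypothetical sketch of the kind of mastery-based progression a teacher dashboard might toggle. To be clear, none of this is ObjectiveEd’s actual code; the level names, sliding-window size and threshold are illustrative assumptions drawn from the Barnyard description above:

```python
from collections import deque

# Direction levels as described for the Barnyard game (illustrative labels).
LEVELS = ["left/right", "cardinal directions", "clock directions", "degrees"]

class AdaptiveDrill:
    """Advance a student once their success rate over a sliding window of
    recent attempts meets the IEP-style mastery threshold (e.g. 90%)."""

    def __init__(self, threshold=0.9, window=10, auto_advance=True):
        self.level = 0
        self.threshold = threshold
        self.auto_advance = auto_advance
        self.recent = deque(maxlen=window)  # rolling record of pass/fail

    def record(self, success):
        """Log one attempt; advance the level if mastery is reached."""
        self.recent.append(bool(success))
        window_full = len(self.recent) == self.recent.maxlen
        mastered = window_full and (sum(self.recent) / len(self.recent)
                                    >= self.threshold)
        if mastered and self.auto_advance and self.level < len(LEVELS) - 1:
            self.level += 1
            self.recent.clear()  # start a fresh window at the new level
        return LEVELS[self.level]

drill = AdaptiveDrill()
for _ in range(10):          # ten successes in a row...
    drill.record(True)
print(LEVELS[drill.level])   # ...advances past "left/right"
```

Setting `auto_advance=False` models the teacher’s other option in the quote: hold the student at the current level and promote them manually.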
This isn’t meant to be a full-on K-12 education in a tablet app. But it helps close the gap between kids who can play Mavis Beacon or whatever on school computers and vision-impaired kids who can’t.
Importantly, the platform is not being developed without expert help — or, as is actually very important, without a business plan.
“We’ve developed relationships with several schools for the blind as well as leaders in the community to build educational games that tackle important skills,” Schultz said. “We work with both university researchers and experienced Teachers of Visually Impaired students, and Certified Orientation and Mobility specialists. We were surprised at how many different skills and curriculum subjects that teachers really need.”
Based on their suggestions, for instance, the company has built two games to teach iPhone gestures and the accessibility VoiceOver rotor. This may be a proprietary technology from Apple, but it’s something these kids need to know how to use, just like they need to know how to run a Google search, use a mouse without being able to see the screen, and other common computing tasks. Why not learn it in a game like the other stuff?
Making technological advances is all well and good, but doing so while building a sustainable business is another thing many education startups have failed to address. Fortunately, public school systems actually have significant money set aside specifically for students with special needs, and products that improve education outcomes are actively sought and paid for. These state and federal funds can’t be siphoned off to use on the rest of the class, so if there’s nothing to spend them on, they go unused.
ObjectiveEd has the benefit of being easily deployed without much specialty hardware or software. It runs on iPads, which are fairly common in schools and homes, and the dashboard is a simple web one. Although it may eventually interface with specialty hardware like Braille readers, it’s not necessary for many of the games and lessons, so that lowers the deployment bar as well.
The plan for now is to finalize and test the interface and build out the games library — ObjectiveEd isn’t quite ready to launch, but it’s important to build it with constant feedback from students, teachers and experts. With luck, in a year or two the visually impaired youngsters at a school near you might have a fun new platform to learn and play with.
“ObjectiveEd exists to help teachers, parents and schools adapt to this new era of gamified learning for students with disabilities, starting with blind and visually impaired students,” Schultz said. “We firmly believe that well-designed software combined with ‘off-the-shelf’ technology makes all this possible. The low cost of technology has truly revolutionized the possibilities for improving education.”
After evacuating a university library due to a suspected “gas leak,” Australian school officials discovered that the gross smell was actually coming from a smelly durian fruit. On May 9, the University of […]
A development lab used by Samsung engineers was leaking highly sensitive source code, credentials and secret keys for several internal projects — including its SmartThings platform, a security researcher found.
The electronics giant left dozens of internal coding projects on a GitLab instance hosted on a Samsung-owned domain, Vandev Lab. The instance, used by staff to share and contribute code to various Samsung apps, services and projects, was spilling data because the projects were set to “public” and not properly protected with a password, allowing anyone to look inside at each project, access and download the source code.
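For context, a misconfigured “public” setting like this is trivially discoverable from the outside: GitLab’s REST API will enumerate every public project on an instance for an unauthenticated caller. A minimal Python sketch of such an audit (the instance hostname and sample project here are hypothetical, not Samsung’s):

```python
import json

def public_projects_url(base_url, page=1, per_page=100):
    """URL for GitLab's REST endpoint listing projects visible without auth.

    With no token supplied, /api/v4/projects returns only projects whose
    visibility is "public" -- exactly what an outside auditor (or attacker)
    can see on a self-hosted instance.
    """
    return (f"{base_url.rstrip('/')}/api/v4/projects"
            f"?visibility=public&simple=true&page={page}&per_page={per_page}")

def summarize(projects_json):
    """Reduce the API's JSON response to (project path, URL) pairs."""
    return [(p["path_with_namespace"], p["web_url"])
            for p in json.loads(projects_json)]

# A trimmed-down example of what the endpoint returns (hypothetical project):
sample = ('[{"path_with_namespace": "vandev/smartthings-ci", '
          '"web_url": "https://gitlab.example.com/vandev/smartthings-ci"}]')
print(public_projects_url("https://gitlab.example.com/"))
print(summarize(sample))
```

Anything listed by a request like this, source included, is downloadable by anyone who asks, which is why the missing password protection mattered so much.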
Mossab Hussein, a security researcher at Dubai-based cybersecurity firm SpiderSilk who discovered the exposed files, said one project contained credentials that allowed access to the entire AWS account that was being used, including more than 100 S3 storage buckets that contained logs and analytics data.
Many of the folders, he said, contained logs and analytics data for Samsung’s SmartThings and Bixby services, but also several employees’ exposed private GitLab tokens stored in plaintext, which allowed him to gain additional access from 42 public projects to 135 projects, including many private projects.
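Plaintext tokens like these are typically found by simply pattern-matching the checked-out source. A toy scanner might look like the following; the AWS access-key-ID format (AKIA plus 16 characters) is documented, while the generic assignment pattern is a loose heuristic of my own that will produce false positives:

```python
import re

# Patterns for a couple of well-known credential shapes.
PATTERNS = {
    "aws-access-key-id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "assigned-secret": re.compile(
        r"(?i)\b(private[_-]?token|secret[_-]?key|password)\s*[:=]\s*['\"]?"
        r"([A-Za-z0-9/+=_-]{12,})"),
}

def scan_text(text):
    """Return (line_number, pattern_name, matched_text) for suspect lines."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pat in PATTERNS.items():
            m = pat.search(line)
            if m:
                hits.append((lineno, name, m.group(0)))
    return hits

sample = ("url = 'https://gitlab.example.com'\n"
          "PRIVATE_TOKEN = 'abc123abc123abc123'\n")
print(scan_text(sample))  # flags only the second line
```

Real secret scanners (the kind GitLab and GitHub now run on pushes) are far more thorough, but the principle is the same: credentials committed as literals are greppable by anyone with read access.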
Samsung told him some of the files were for testing but Hussein challenged the claim, saying source code found in the GitLab repository contained the same code as the Android app, published in Google Play on April 10.
The app, which has since been updated, has more than 100 million installs to date.
“I had the private token of a user who had full access to all 135 projects on that GitLab,” he said, which could have allowed him to make code changes using a staffer’s own account.
Hussein shared several screenshots and a video of his findings for TechCrunch to examine and verify.
The exposed GitLab instance also contained private certificates for Samsung’s SmartThings’ iOS and Android apps.
Hussein also found several internal documents and slideshows among the exposed files.
“The real threat lies in the possibility of someone acquiring this level of access to the application source code, and injecting it with malicious code without the company knowing,” he said.
Through exposed private keys and tokens, Hussein documented a vast amount of access that, if obtained by a malicious actor, could have been “disastrous,” he said.
Hussein, a white-hat hacker and data breach discoverer, reported the findings to Samsung on April 10. In the days following, Samsung began revoking the AWS credentials, but it’s not known if the remaining secret keys and certificates were revoked.
Samsung still hasn’t closed the case on Hussein’s vulnerability report, close to a month after he first disclosed the issue.
“Recently, an individual security researcher reported a vulnerability through our security rewards program regarding one of our testing platforms,” Samsung spokesperson Zach Dugan told TechCrunch when reached prior to publication. “We quickly revoked all keys and certificates for the reported testing platform and while we have yet to find evidence that any external access occurred, we are currently investigating this further.”
Hussein said Samsung took until April 30 to revoke the GitLab private keys. Samsung also declined to answer specific questions we had and provided no evidence that the Samsung-owned development environment was for testing.
Hussein is no stranger to reporting security vulnerabilities. He recently disclosed a vulnerable back-end database at Blind, an anonymous social networking site popular among Silicon Valley employees — and found a server leaking a rolling list of user passwords for scientific journal giant Elsevier.
Samsung’s data leak, he said, was his biggest find to date.
“I haven’t seen a company this big handle their infrastructure using weird practices like that,” he said.
If you’ve flirted with the idea of buying a robot vacuum you may also have stepped back from the brink in unfolding horror at the alphabetic soup of branded discs popping into view. Consumer choice sounds like a great idea until you’ve tried to get a handle on the handle-less vacuum space.
Amazon offers an A to Z linklist of “top brands” that’s only a handful of letters short of a full alphabetic set. The horror.
What awaits the unseasoned robot vacuum buyer as they resign themselves to hours of online research to try to inform — or, well, form — a purchase decision is a seeming endless permutation of robot vac reviews and round-ups.
Unfortunately there are just so many brands in play that all these reviews tend to act as fuel, feeding a growing black hole of indecision that sucks away at your precious spare time. It demands you spend more and more of it reading about robots that suck (when you could, let’s be frank, be getting on with the vacuuming yourself), only for you to come up for air each time even less convinced that buying a robot dirtbag is at all a good idea.
Reader, I know, because I fell into this hole. And it was hellish. So in the spirit of trying to prevent anyone else falling prey to convenience-based indecision I am — apologies in advance — adding to the pile of existing literature about robot vacuums with a short comparative account that (hopefully) helps cut through some of the chaff to the dirt-pulling chase.
Here’s the bottom line: Budget robot vacuums that lack navigational smarts are simply not worth your money, or indeed your time.
Yes, that’s despite the fact they are still actually expensive vacuum cleaners.
Basically these models entail overpaying for a vacuum cleaner that’s so poor you’ll still have to do most of the job yourself (i.e. with a non-robotic vacuum cleaner).
It’s the very worst kind of badly applied robotics.
Abandon hope of getting anything worth your money at the bottom end of the heap. I know this because, alas, I tried — opting, finally and foolishly (but, in my defence, at a point of near desperation after sifting so much virtual chaff the whole enterprise seemed to have gained lottery odds of success and I frankly just wanted my spare time back), for a model sold by a well-known local retailer.
It was a budget option but I assumed — or, well, hoped — the retailer had done its homework and picked a better-than-average choice. Or at least something that, y’know, could suck dust.
The brand in question (Rowenta) sat alongside the better known (and a bit more expensive) iRobot on the shop shelf. Surely that must count for something? I imagined wildly. Reader, that logic is a trap.
I can’t comment on the comparative performance of iRobot’s bots, which I have not personally tested, but I do not hesitate to compare a €180 (~$200) Rowenta-branded robot vacuum to a very expensive cat toy.
This robot vacuum was spectacularly successful at entertaining the cat — presumably on account of its dumb disposition, bouncing stupidly off of furniture owing to a total lack of navigational smarts. (Headbutting is a pretty big clue to how stupid a robot it is, as it’s never a stand-in for intelligence even when encountered in human form.)
Even more tantalizingly, from the cat’s point of view, the bot featured two white and whisker-like side brushes that protrude and spin at paw-tempting distance. In short: Pure robotic catnip.
The cat did not stop attacking the bot’s whiskers the whole time it was in operation. That certainly added to the obstacles getting in its way. But the more existential problem was it wasn’t sucking very much at all.
At the end of its first concluded ‘clean’, after it somehow managed to lurch its way back to first bump and finally hump its charging hub, I extracted the bin and had to laugh at the modest sized furball within. I’ve found larger clumps of dust gathering themselves in corners. So: Full marks for cat-based entertainment but as a vacuum cleaner it was horrible.
At this point I did what every sensible customer does when confronted with an abject lemon: Returned it for a full refund. And that, reader, might have been that for me and the cat and robot vacs. Who can be bothered to waste so much money and time for what appeared laughably incremental convenience? Even with a steady supply of cat fur to contend with.
But as luck would have it a Roborock representative emailed to ask if I would like to review their latest top-of-the-range model — which, at €549, does clock in at the opposite end of the price scale; ~3x the pitiful Rowenta. So of course I jumped at the chance to give the category a second spin — to see if a smarter device could impress me and not just tickle the cat’s fancy.
Clearly the price difference here, at the top vs. the bottom of the range, is substantial. And yet, if you bought a car that was 3x cheaper than a Ferrari, you’d still expect not just that the wheels stay on, but that it can actually get you somewhere, in good time, and without making you horribly carsick.
Turns out buyers of robot vacuums need to tread far more carefully.
Here comes the bookending top-line conclusion: Robot vacuums are amazing. A modern convenience marvel. But — and it’s a big one — only if you’re willing to shell out serious cash to get a device that actually does the job intended.
Comparing the Roborock S6 and the Rowenta Smart Force Essential Aqua RR6971WH (to give it its full and equally terrible name) is like comparing a high-end electric car with a wind-up kid’s toy.
Where the latter product was so penny-pinching the company hadn’t even paid to include in the box a user manual that contained actual words — opting, we must assume, to save on translation costs by producing a comic packed with inscrutable graphics and bizarro don’t-do diagrams which only served to cement the fast-cooling buyer’s conviction they’d been sold a total lemon — the Roborock’s box contains a well-written paper manual with words and clearly labeled diagrams. What a luxury!
At the same time there’s not really that much you need to grok to get your head around operating the Roborock. After a first pass to familiarize yourself with its various functions it’s delightfully easy to use. It will even produce periodic vocal updates — such as telling you it’s done cleaning and is going back to base. (Presumably in case you start to worry it’s gone astray under the bed. Or that quiet industry is a front for brewing robotic rebellion against indentured human servitude.)
One button starts a full clean — and this does mean full, thanks to on-board laser navigation that allows the bot to map the rooms in real time. This means you get methodical passes, minimal headbutting and only occasional spots missed. (Another button will do a spot clean if the S6 does miss something or there’s a fresh spill that needs tidying — you just lift the bot to where you want it and hit the appropriate spot.)
There is an app too, if you want to access extra features like being able to tell it to go clean a specific room, schedule cleans or set no-go zones. But, equally delightfully, there’s no absolute need to hook the bot to your wi-fi just to get it to do its primary job. All core features work without the faff of having to connect it to the Internet — nor indeed the worry of who might get access to your room-mapping data. From a privacy point of view this wi-fi-less app-free operation is a major plus.
In a small apartment with hard flooring the only necessary prep is a quick check to clear stuff like charging cables and stray socks off the floor. You can of course park dining chairs on the table to offer the bot a cleaner sweep. Though I found the navigation pretty adept at circling chair legs. Sadly the unit is a little too tall to make it under the sofa.
The S6 includes an integrated mopping function, which works incredibly well on lino-style hard flooring (but won’t be any use if you only have carpets). To mop you fill the water tank attachment; velcro-fix a dampened mop cloth to the bottom; and slide-clip the whole unit under the bot’s rear. Then you hit the go button and it’ll vacuum and mop in the same pass.
In my small apartment the S6 had no trouble doing a full floor clean in under an hour, without needing to return to base to recharge in the middle. (Roborock says the S6 will drive for up to three hours on a single charge.)
It also did not seem to get confused by relatively dark flooring in my apartment — which some reviews had suggested can cause headaches for robot vacuums by confusing their cliff sensors.
After that first clean I popped the lid to check on the contents of the S6’s transparent lint bin — finding an impressive quantity of dusty fuzz neatly wadded therein. This was really just robot vacuum porn, though; the gleaming floors spoke for themselves on the quality of the clean.
The level of dust gobbled by the S6 vs the Rowenta underlines the quality difference between the bottom and top end of the robot vacuum category.
So where the latter’s plastic carapace immediately became a magnet for all the room dust it had kicked up but spectacularly failed to suck, the S6’s gleaming white shell has stayed remarkably lint-free, acquiring only a minimal smattering of cat hairs over several days of operation — while the floors it’s worked have been left visibly dust- and fur-free. (At least until the cat got to work dirtying them again.)
Higher suction power, better brushes and a higher quality integrated filter appear to make all the difference. The S6 also does a much better cleaning job a lot more quietly. Roborock claims it’s 50% quieter than the prior model (the S5) and touts it as its quietest robot vacuum yet.
It’s not super silent but is quiet enough when cleaning hard floors not to cause a major disturbance if you’re working or watching something in the same room. Though the novelty can certainly be distracting.
Even the look of the S6 exudes robotic smarts — with its raised laser-housing bump resembling a glowing orange cylonic eye-slot.
I was surprised, at first glance, by the single, rather feeble-looking side brush vs the firm pair the Rowenta had fixed to its undercarriage. But again the S6’s tool is smartly applied — stepping its speed up and down depending on what the bot’s tackling. I found it could miss the odd bit of lint or debris such as cat litter, but when it did these specks stood out as the exception on an otherwise clean floor.
It’s also true that the cat did stick its paw in again to try attacking the S6’s single spinning brush. But these attacks were fewer and a lot less fervent than those aimed at the Rowenta, as if the bot’s more deliberate navigation commanded greater respect and/or a more considered ambush. So it appears that even to a feline eye the premium S6 looks a lot less like a dumb toy.
On a practical front, the S6’s lint bin has a capacity of 480ml. Roborock suggests cleaning it out weekly (assuming you’re using the bot every week), as well as washing the integrated dust filter (it supplies a spare in the box so you can switch one out to clean it and have enough time for it to fully dry before rotating it back into use).
If you use the mopping function the supplied reusable mop cloths need washing afterwards too. (Roborock also includes a few disposable alternatives in the box, but that seems a pretty wasteful option when it’s easy enough to stick a reusable cloth in with a load of laundry or give it a quick wash yourself.) So if you’re chasing a fully automated, robot-powered end to your cleaning chores, be warned that a little human elbow grease is still required to keep everything running smoothly.
Still, there’s no doubt a top-of-the-range robot vacuum like the S6 will save you time cleaning.
If you can justify the not inconsiderable cost of buying this extra time by shelling out for a premium robot vacuum that’s smart enough to clean effectively, all that’s left to figure out is how to spend your time windfall wisely — resisting the temptation to just put your feet up and watch the clever little robot at work.
Microsoft’s yearly Imagine Cup student startup competition crowned its latest winner today: EasyGlucose, a non-invasive, smartphone-based method for diabetics to test their blood glucose. It and the two other similarly beneficial finalists presented today at Microsoft’s Build developer conference.
The Imagine Cup brings together winners of many local student competitions around the world, with a focus on social good and, of course, Microsoft services like Azure. Last year’s winner was a smart prosthetic forearm that uses a camera in the palm to identify the object it is meant to grasp. (They were on hand today as well, with an improved prototype.)
The three finalists hailed from the U.K., India and the U.S.; EasyGlucose was a one-person team from my alma mater UCLA.
EasyGlucose takes advantage of machine learning’s knack for spotting the signal in noisy data, in this case the tiny details of the eye’s iris. It turns out, as creator Bryan Chiang explained in his presentation, that the iris’s “ridges, crypts and furrows” hide tiny hints as to their owner’s blood glucose levels.
These features aren’t the kind of thing you can see with the naked eye (or rather, on the naked eye), but by clipping a macro lens onto a smartphone camera, Chiang was able to get a clear enough image that his computer vision algorithms were able to analyze them.
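The pipeline described — crop the iris image, extract texture detail, feed it to a learned model that regresses a glucose value — can be sketched in miniature. This is purely illustrative: the feature extraction, model choice, and synthetic data below are hypothetical stand-ins, not EasyGlucose's actual system.

```python
# Illustrative sketch of an image-to-value regression pipeline, in the spirit
# of what EasyGlucose describes. All features, data and labels are fabricated.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def iris_features(img: np.ndarray) -> np.ndarray:
    """Crude texture descriptors for a grayscale iris crop (hypothetical)."""
    gy, gx = np.gradient(img.astype(float))
    edge_energy = np.hypot(gx, gy)  # rough proxy for ridge/furrow density
    return np.array([img.mean(), img.std(),
                     edge_energy.mean(), edge_energy.std()])

# Synthetic stand-in data: 200 random "iris crops" with fake glucose labels.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(200, 64, 64))
labels = rng.uniform(70, 180, size=200)  # mg/dL range, fabricated

X = np.stack([iris_features(im) for im in images])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, labels)
preds = model.predict(X)
```

A real system would of course use far richer features (or a deep network trained end to end on those 15,000+ eye images) and rigorous held-out evaluation before any clinical claim.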
The resulting blood glucose measurement is significantly better than any non-invasive measure and more than good enough to serve in place of the most common method used by diabetics: stabbing themselves with a needle every couple of hours. Currently EasyGlucose gets within 7% of the pinprick method, well above what’s needed for “clinical accuracy,” and Chiang is working on closing that gap. No doubt this innovation will be welcomed warmly by the community, as well as the low cost: $10 for the lens adapter, and $20 per month for continued support via the app.
It’s not a home run, or not just yet: Naturally, a technology like this can’t go straight from the lab (or in this case, the dorm) to global deployment. It needs FDA approval first, though it likely won’t have as protracted a review period as, say, a new cancer treatment or surgical device. In the meantime, EasyGlucose has a patent pending, so no one can eat its lunch while it navigates the red tape.
As the winner, Chiang gets $100,000, plus $50,000 in Azure credit, plus the coveted one-on-one mentoring session with Microsoft CEO Satya Nadella.
The other two Imagine Cup finalists also used computer vision (among other things) in service of social good.
Caeli is taking on the issue of air pollution by producing custom high-performance air filter masks intended for people with chronic respiratory conditions who have to live in polluted areas. This is a serious problem in many places that cheap or off-the-shelf filters can’t really solve.
It uses your phone’s front-facing camera to scan your face and pick the mask shape that makes the best seal against your face. What’s the point of a high-tech filter if the unwanted particles just creep in the sides?
Part of the mask is a custom-designed compact nebulizer for anyone who needs medication delivered in mist form, for example someone with asthma. The medicine is delivered automatically according to the dosage and schedule set in the app — which also tracks pollution levels in the area so the user can avoid hot zones.
Finderr is an interesting solution to the problem of visually impaired people being unable to find items they’ve left around their home. By using a custom camera and computer vision algorithm, the service watches the home and tracks the placement of everyday items: keys, bags, groceries and so on. Just don’t lose your phone, as you’ll need that to find the other stuff.
You call up the app and tell it (by speaking) what you’re looking for, then the phone’s camera determines your location relative to the item you’re looking for, giving you audio feedback that guides you to it in a sort of “getting warmer” style, and a big visual indicator for those who can see it.
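That "getting warmer" guidance loop boils down to mapping the estimated user-to-item distance onto a spoken cue. A minimal sketch, with entirely hypothetical thresholds (Finderr's actual logic isn't public):

```python
# Map a distance estimate (metres) to a "getting warmer"-style audio prompt.
# Thresholds and phrases are invented for illustration.
def warmth_cue(distance_m: float) -> str:
    if distance_m < 0.5:
        return "you're right on it"
    if distance_m < 2.0:
        return "getting warmer"
    if distance_m < 5.0:
        return "keep going"
    return "cold - try another direction"
```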
After their presentations, I asked the creators a few questions about upcoming challenges, since as is usual in the Imagine Cup, these companies are extremely early-stage.
Right now EasyGlucose is working well, but Chiang emphasized that the model still needs lots more data and testing across multiple demographics. It’s trained on 15,000 eye images but many more will be necessary to get the kind of data they’ll need to present to the FDA.
Finderr recognizes all the images in the widely used ImageNet database, but the team’s Ferdinand Loesch pointed out that new items can be added easily with as few as 100 training images. As for the upfront cost, the U.K. offers a £500 grant to visually impaired people for this sort of thing, and the team engineered the 360-degree ceiling-mounted camera to minimize the number needed to cover a home.
Caeli noted that the nebulizer, which really is a medical device in its own right, is capable of being sold and promoted on its own, perhaps licensed to medical device manufacturers. There are other smart masks coming out, but he had a pretty low opinion of them (not strange in a competitor, but there isn’t some big market leader they need to dethrone). He also pointed out that in the target market of India (from which they plan to expand later) it isn’t as difficult to get insurance to cover this kind of thing.
While these are early-stage companies, they aren’t hobbies — though, admittedly, many of their founders are working on them between classes. I wouldn’t be surprised to hear more about them and others from Imagine Cup pulling in funding and hiring in the next year.
A drone sighting caused all flights to be suspended at Frankfurt Airport for around an hour this morning. The airport is Germany’s busiest by passenger numbers, serving almost 14.8 million passengers in the first three months of this year.
In a tweet sent after flights had resumed, the airport said operations were suspended at 07:27 and the suspension lifted at 08:15, with flights resuming at 08:18.
It added that security authorities were investigating the incident.
A report in local press suggests more than 100 takeoffs and landings were cancelled as a result of the disruption caused by the drone sighting.
It’s the second such incident at the airport after a drone sighting at the end of March also caused flights to be suspended for around half an hour.
Drone sightings near airports have been on the rise for years as drones have become increasingly affordable, as have reports of near misses between drones and aircraft.
The Frankfurt suspension follows the far more serious disruption caused by repeat drone sightings at the UK’s second largest airport, Gatwick, late last year — which caused a series of flight shutdowns and travel misery for hundreds of thousands of people right before the holiday period.
The UK government came in for trenchant criticism immediately afterwards, with experts saying it had failed to heed warnings about the risks posed by drone misuse. A planned drone bill has also been long delayed, meaning new legislation to comprehensively regulate drones has slipped.
In response to the Gatwick debacle the UK government quickly pushed through an expansion of existing drone no-fly zones around airports after criticism by aviation experts — beefing up the existing 1km exclusion zone to 5km. It also said police would get new powers to tackle drone misuse.
In Germany an amendment to air traffic regulations entered into force in 2017 that prohibits drones being flown within 1.5km of an airport. Drones are also banned from being flown in controlled airspace.
However, with local press reporting rising drone sightings near German airports — the country’s air traffic control registered 125 last year, 31 of them around Frankfurt — the 1.5km limit looks similarly inadequate.