Bluetooth or Wi-Fi

The Internet of Things (IoT) is one of the major growth markets of our time. Fundamental to the IoT concept is connectivity between devices. Typically, the preferred connectivity mechanism is wireless, using both short- and long-range wireless technologies.

The choice amongst wireless technologies used in 2016 is broad, particularly with short-range technologies (Figure 1).

image

Figure 1

In Figure 2, developers reported on which wireless protocols they were expecting to design with over the next 12 months.

image

Figure 2

As shown, Bluetooth and Wi-Fi are the most popular protocols for both IoT and non-IoT embedded developments. Their adoption is particularly high in the areas of industrial controls, consumer electronics, electronic instrumentation, and medical devices. Bluetooth usage within these industries is shown in Figure 3.

 

image

Figure 3

For some time, people have debated which of these technologies is best. Part of the argument was based on characteristics such as power use, range, and data throughput. The general consensus now is that each technology has some use cases to which it is best suited, and other technologies are better suited to other uses. There is no simple “one size fits all” approach. Rather, designers must analyze their own needs and make their own decisions.
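That kind of needs analysis can be caricatured as a toy decision function. The thresholds below are illustrative assumptions for the sketch, not figures from the survey:

```python
# Toy use-case selector: the cut-off values are invented for illustration,
# not taken from any standard or from the EMF survey data.
def pick_protocol(throughput_mbps: float, battery_powered: bool, range_m: float) -> str:
    if throughput_mbps > 3 or range_m > 100:
        return "Wi-Fi"        # higher throughput and range, at higher power
    if battery_powered:
        return "Bluetooth"    # lower power draw suits constrained devices
    return "either"

# A battery-powered sensor sending a trickle of data nearby:
print(pick_protocol(throughput_mbps=0.5, battery_powered=True, range_m=10))  # Bluetooth
```

Real selection would weigh many more axes (latency, topology, cost, ecosystem), which is exactly why the article argues the analysis is worth avoiding when both can be integrated.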

Costs must figure into the equation as well. Data from the 2016 EMF survey on development costs is shown in the table. Interestingly, this data shows that the costs of Bluetooth only, Wi-Fi only, and Bluetooth and Wi-Fi combined are all very similar.

image

From this analysis, it appears that the cost of developing both Bluetooth and Wi-Fi into a product is similar to that of choosing one over the other. Integrating both eliminates the complex use-case analysis, as well as the cost of making that decision, and provides the flexibility to cater for use cases that were not envisioned at the commencement of development.

Bluetooth and Wi-Fi together can be handled by a single wireless chipset, such as the TI WiLink 8. Combined with Clarinox software, this can be achieved with various RTOS and MCU combinations. The Clarinox stacks are among the few that can handle the interoperability of the two technologies, and Clarinox’s common framework ensures that priorities between them are handled.

So your choice should be an “and” and not an “or.”

Google Glass

 

image

 

Google Glass is no longer being marketed to consumers, but its enterprise business continues to pick up pace. Today one of the more promising companies developing medical services using Google’s connected eyewear is announcing a significant investment in its technology, which aims to “rehumanise the interaction” between doctors and patients by pulling physicians’ faces away from their computer screens, according to its CEO.

Augmedix, a startup out of San Francisco that has developed a platform for doctors to collect, update and recall patient and other medical data in real-time, has raised $17 million in a strategic round.

The investment is significant because of who is making it: it comes from five of the biggest healthcare providers in the U.S. — Sutter Health, Dignity Health, Catholic Health Initiatives (CHI), TriHealth Inc., and a fifth that is remaining unnamed for now. Together, these groups — which operate hospitals and other facilities, and in other aspects compete against each other for business in the healthcare industry — cover about 100,000 doctors and other healthcare providers and millions of patients. The idea will be for Augmedix to supply these physicians and other staff with their connected eyewear.

This investment comes after a Series A but before a full Series B (which CEO Ian Shakil said in an interview is not being raised right now), and it brings the total raised by Augmedix to $40 million. After its last round of $16 million, Augmedix was valued at around $100 million, and while Shakil is not disclosing the valuation of the company, he told me that this was a “healthy up-round.” From what I understand, the valuation is between $120 million and $160 million, closer I think to the latter of the two.

One of the big criticisms of Google Glass (among other wearables) has been that devices that you put on your face can alienate you from people you are interacting with — both because they put people off, and also because they distract you, the wearer, from focusing on the person in front of you.

Ironically, it seems that the exact opposite of this is the reason behind Augmedix’s growth to date. Shakil noted that one of the big problems today in U.S. medical systems is the amount of data that doctors and others on the medical team are required to reference and input for each patient.

“When you are with doctors without Glass, they are charting and clicking on computers for a lot of the time, and not focusing on their patients,” he said. “When you put on Google Glass to collect and reference that information, it helps you engage with the patient better.” Shakil added that the Augmedix system “takes care of documentation in the background faster than you would. It humanises the process.”

This is also what attracted the strategic investors it seems, even to the point of putting in money alongside competitors.

“At Dignity Health, we are committed to developing partnerships that harness the great potential of technology and apply it in ways that help patients and providers make better day-to-day decisions about care,” said Dr. Davin Lundquist, chief medical information officer, Dignity Health, in a statement. “The use of Google Glass and Augmedix allows our doctors to spend more time with patients by eliminating the distraction of entering information into a patient’s electronic medical record on the computer. This enables our healthcare providers to give more focused attention to our patients and results in a better patient experience.”

“As we strive to create the high-quality, high-value healthcare experience our patients expect from Sutter Health, new technology tools and services allow us to innovate in ways that deliver a more efficient, affordable and personalized level of care,” said Dr. Albert Chan, Sutter Health’s vice president, chief of digital patient experience, also in a statement. “Wearable technology holds tremendous promise, especially for enhancing the office visit experience. We are committed to partnering with our patients, and value how our growing network of digital health innovators helps strengthen those patient-doctor relationships in new ways.”

Interestingly, the humanizing doesn’t end at the patient end of the system. The software that Augmedix currently uses relies on a large team of humans to help enter info and update records in the back end. “It’s almost more powered by humans than AI and speech recognition today,” Shakil said.

However, he added that part of the funding is going to build out more of the tech using some of the later innovations in the field: “We will be deploying more natural language processing in the future. It creates more efficiencies for us to do so.” That may be using tech from Google (which is ramping up in this space), but just as likely Augmedix will consider solutions from Nuance and others, he said.

Google Glass always felt and continues to feel somewhat like a niche play, so just how big is Augmedix today? Today there are “hundreds” of doctors already using Augmedix’s software on Glass, concentrated in Southern and Central California, Shakil said. That may not sound like a lot, but Shakil points out that each doctor pays “low-single digit thousands of dollars” each month, which works out to a “very reasonable” amount of annual revenue.

He said the company is on track to have thousands of doctors using this by next year, with the bigger target of 10,000 doctors within five years. Considering that these five new investors cover 100,000 doctors and other practitioners, and the amount of outlay that’s already dedicated to IT in the medical industry, a ten percent penetration rate doesn’t sound too outlandish.

Currently, Glass is at the center of what Augmedix does, but it sounds like this isn’t something the company is necessarily wedded to for the long term. Indeed, while Google was something of an early mover with Glass (cleverly lowering the bar for building solutions with its Enterprise edition), the world has moved on when it comes to connected headsets that feed their users information. Hardware now ranges from other smart eyewear to full-on augmented reality and virtual reality gear from the likes of Facebook’s Oculus, Meta, Microsoft, Samsung and more.

Shakil says that for now its solution and business are focused on Glass (note: Google itself has not invested in Augmedix, though VCs like DCM and Emergence have). But it is also testing other alternatives in what Shakil refers to as “light AR.”

Down the line, Augmedix wants to add more services on to its platform to better complete the loop. This will include patient-oriented features, “so that the patient can go home and relive the visits and listen again to what the doctor said” or be taken through demonstrations for self-care.

Augmedix also wants to add more guidance for doctors, to help them remember different points for, say, smoking cessation regimens or other clinical work. Further down the line, you could imagine how this might extend into other aspects of a doctor’s work, such as during procedures.

Digital Wallet

image

The “wallet” in the modern sense of “flat case for holding paper currency” dates back almost 200 years. The word itself goes back 700 years, and the concept (minus paper currency) for millennia.

Leather wallets were not “smart,” of course; they were atom agnostic, payment type agnostic, even, as credit cards and the like started proliferating in the mid 20th century. But today the payment type is almost a pointer — in computer science vernacular — to a source of money. And the wallet itself is the master pointer, used for opening and closing a transaction, and choosing which sub-pointer to assign.

Because intercepting the payment leads to a whole downstream treasure of goodies, the wallet — once tanned animal hide — is going to be the ultimate financial platform. As digital wallets increasingly become the origination point for consumer spending, they will become THE platform for downstream financial services — creating an opportunity for startups and a problem for established players.

The problem, of course, is that a payment type can become a wallet, and a wallet can become a payment type. So which is which? If a ridesharing company has 100 million credentials, they’ve solved half of the network effect problem of being a payment company — so you could imagine using that app as your wallet at, say, Walmart. Or Starbucks, which is one of the biggest wallets, has a pointer within its wallet to Visa Checkout, another wallet, pointing to a card type (a Visa card, or even a MasterCard/Amex/Discover card), pointing to a “loan” (the “credit” part of a credit card), ultimately pointing to a bank account.

As a stack, we have hardware — your mobile phone — at the top and bank accounts holding the actual treasure at the very bottom. But it’s better to think of this “stack” as really a system of pointers, in this case downwards. And the goal for businesses is finding and occupying a defensible position in this stack that allows them to intercept payments, capturing and controlling value to become that ultimate financial platform.
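The pointer-chain view of the stack can be sketched with a toy data structure. The names below are illustrative, mirroring the Starbucks example above, not any real company's API:

```python
# Toy model of the payments "stack" as a chain of downward pointers.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    name: str
    points_to: Optional["Node"] = None

    def resolve(self) -> str:
        """Follow the pointers down to the ultimate source of funds."""
        node = self
        while node.points_to:
            node = node.points_to
        return node.name

# Wallet -> sub-wallet -> card -> credit line -> bank account
bank = Node("bank account")
credit = Node("credit line", bank)
card = Node("Visa card", credit)
cloud_wallet = Node("Visa Checkout", card)
app_wallet = Node("Starbucks app", cloud_wallet)

print(app_wallet.resolve())  # bank account
```

Whoever owns the node the consumer actually touches controls which sub-pointers get exercised, which is the article's point about intercepting payments.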

 image

On my Apple iPhone, I can run the Uber app and pay via PayPal, which deducts the money from my American Express Card. For the first four components of the stack — hardware, operating system, app, and cloud backbone — the heuristics of success for capturing value are the number of integrations (i.e., the number of places people can use the wallet) and number of credentials (users who have committed more than one tender type).

Given the massive number of credentials they have and the controlling position as the “start” of the stack — unlike other players, Apple has both hardware and OS — Apple’s wallet as platform could deal a crippling blow to everyone down the stack.

Especially because the flow in this stack only goes one way: players below don’t get access to the resources up the stack.

In a mobile-only world, a well-coordinated effort to let you simply touch your thumb to your iPhone to pay on any ecommerce site or app could wipe out probably 20% of PayPal’s revenue, overnight [two-thirds of PayPal revenue is merchant services off eBay; assume 25% iOS share].
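The bracketed assumptions can be sanity-checked with quick arithmetic:

```python
# Two-thirds of PayPal revenue from merchant services off eBay,
# times an assumed 25% iOS share (both figures from the bracket above).
merchant_share = 2 / 3
ios_share = 0.25
at_risk = merchant_share * ios_share
print(f"{at_risk:.0%}")  # ~17%, i.e. in the ballpark of the "20% overnight" figure
```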

The real value of occupying a defensible place in this stack is not even in processing the payment, however. Capital One spends a lot of money every year convincing you to apply for and use its credit card. Once subsumed under a digital wallet, though, that “usage” component gets further and further out of Capital One’s control, with tremendous implications for downstream interest (lending) revenue.

One change in Apple’s product design — for example, something as simple as alphabetization, which a leather wallet doesn’t do! — could move Bank of America ahead of Capital One as a “default,” moving more purchases in that direction.

The “end result” of this whole system of pointers is usually an increasing balance on a revolving credit facility — a credit card. Take LendingClub and Prosper, the two biggest marketplace lending companies.

 image

About half of LendingClub’s loan originations come from refinancing credit card debt, which they source via U.S. Postal Service mail ads, Google ads, etc. But controlling a position in the purchase stack could and arguably should replace their normal customer acquisition process; rather than waiting for a consumer to accrue a large balance from a series of purchases (at a ridiculously high credit card interest rate) and then refinance, catch it as the balance comes in from purchases.

The next large consumer finance company is likely to interrupt this chain of pointers. But at which point in the stack? It will be very challenging to attack the top of the stack as that would require a hardware+OS wallet with massive adoption and a massive number of payment credentials. And attacking the bottom of the stack is challenging as well …and relatively unprofitable at that.

Right now, LendingClub will take your 18% APR Chase/Citi/et al interest rate and refinance it down to 10%. But in a world where Apple Pay controls the front and existing banks like Wells Fargo provide the source of funds at the end, there’s no reason not to “automate away” the credit selection process. Why wouldn’t they just skip right to the rate LendingClub would have given you, or even skip to the best “marketplace lending” rate?
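To see the stakes of skipping straight to the marketplace rate, a quick illustrative calculation (the carried balance is hypothetical; the rates are the 18% and 10% quoted above):

```python
# Illustrative only: interest saved per year by refinancing a revolving
# balance from a typical 18% card APR to a 10% marketplace-lending rate.
balance = 10_000                 # hypothetical carried balance, USD
card_apr, refi_apr = 0.18, 0.10

yearly_saving = balance * card_apr - balance * refi_apr
print(f"${yearly_saving:,.0f} saved per year")  # $800 saved per year
```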

The biggest dislocation once that happens will be that your “credit card” will no longer be the default source of medium/long duration credit. This has major implications for all of consumer finance.

The future: Wallet apps, rewards, insights

For credit card companies, the smartest thing they can do is to not build their own cloud wallet, which creates an unnecessary “sub-pointer”. Yet many of them are doing this because they’re missing the full view of where value lies in the stack and how to better leverage their position within it.

For cloud wallets — which are facing the existential challenge of being caught, literally, in the middle — the smartest thing they can do is align themselves with a winning application wallet if they’re losing in acquiring enough credentials on their own. Because payment companies (e.g., Chase, Citi) risk being abstracted into irrelevance, attaching themselves to the winning application wallets (the likes of Amazon, Lyft, Starbucks, Uber) is one of the only ways to prioritize their existing “pointer” vis-à-vis others.

So what does this all mean for startups? Well, it will become easier to address the most profitable part of the stack — lending — without getting into the herculean and quixotic path of payments. Companies like PayByTouch over 8 years ago and Powa more recently together evaporated over $500M before dissolving into bankruptcy. Today, there are many other parts of key infrastructure that can be rewritten with access to digital wallet-as-platform: rewards, PFM (personal financial manager, à la Quicken/Mint), merchant recommendations, offers, etc.

There is also a whole generation — millennials — who don’t understand the notion of balancing a checkbook because they don’t even have a checkbook.

It’s an anachronism, as are the PFMs that grew up around that notion, including long delays before purchases show up in “modern” PFMs.

This is because the purchase goes from merchant, to merchant bank, to network, to issuing bank, to aggregator, to PFM. But in digital wallets, purchases show up instantly — allowing recommendations, offers, and discounts to be instantaneous. Digital wallets may finally enable the long-sought “taste graph” (the mother of all “people who bought this, also bought that”) to be built.

All of this will require the “top” of the stack to open up — for Apple, Google, and other players to recognize they are building a financial platform, which, like all platforms, is most valuable when developers have access. Given the size of the market, it’s a question of when, not if, this will happen. And once it does, it’s likely to be a game changer for the banks that for decades have relied on branches and consumer branding, and for startups who will finally find themselves with a capital-efficient entry point for disrupting consumer finance.

USB-C adds authentication protocol


The USB 3.0 Promoter Group has announced it has devised and will adopt a new “USB Type-C Authentication specification.”

The specification means makers of USB devices will be able to encode them with information about their source and function. When connecting to those devices, machines like computers or phones will be able to read that descriptor and choose to connect, or not, depending on policies.

The USB 3.0 Promoter Group says “For a traveller concerned about charging their phone at a public terminal, their phone can implement a policy only allowing charge from certified USB chargers.” Or perhaps you're worried that your organisation's laptop fleet could be compromised by rogue USB devices, in which case you “can set a policy in its PCs granting access only to verified USB storage devices.” It's not clear if that will allow organisations to specify individual devices, or just devices whose manufacturers have implemented the spec.

USB-C needs this spec for two reasons. The first is that once USB-C becomes ubiquitous and makes a single wire responsible for carrying both power and data, hackers will likely exploit opportunities through chargers or devices.

The second is that there is a lot of second-rate electronics about, such as poorly-wired cables capable of destroying equipment. Amazon.com recently banned the sale of non-compliant cables from its web site. If devices can flag such parts as sub-standard, or refuse to connect to them, that is good news for the end user.

Details of the specification can be found in the revised USB 3.1 spec (a 54MB .ZIP file). The TL;DR version is that it “references existing internationally-accepted cryptographic methods for certificate format, digital signing, hash and random number generation,” so it sounds like a conventional issue-certificates-and-check-them protocol.
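Assuming it is indeed a conventional challenge-response scheme, the flow might look like the sketch below. To be clear about assumptions: the real specification uses certificate chains and public-key signatures; here a stdlib HMAC with a shared key stands in for the signing step, purely to show the shape of the exchange:

```python
import hmac, hashlib, secrets

# Hypothetical shared key standing in for the device's certified key pair;
# the real spec uses certificate chains and digital signatures instead.
DEVICE_KEY = secrets.token_bytes(32)

def device_respond(challenge: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Device 'signs' the host's random challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def host_authenticate(policy_allows, key: bytes = DEVICE_KEY) -> bool:
    """Host issues a random nonce, verifies the response, then applies policy."""
    nonce = secrets.token_bytes(16)
    response = device_respond(nonce)
    expected = hmac.new(key, nonce, hashlib.sha256).digest()
    authentic = hmac.compare_digest(response, expected)
    return authentic and policy_allows()

# Example policy: the traveller's phone only accepts certified chargers.
print(host_authenticate(lambda: True))  # True
```

The random nonce is what prevents a rogue device from simply replaying a previously observed valid response.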

Intel's Compute Stick


The Intel Compute Stick (ICS) can be thought of as the offspring of a Raspberry Pi and a Google Chromecast. It is a tiny computer: CPU, RAM and storage on a small motherboard contained within a reasonably well finished case. Protruding from the case is an HDMI male adapter ready to plug into any display boasting its female counterpart.

The ICS is a full working PC: Windows 8.1 with Bing, a quad-core Atom Z3735F processor running at up to 1.83 GHz, 2 GB of memory, 32 GB of on-board storage, b/g/n WiFi, Bluetooth and a microSD card slot.

An Ubuntu version reduces the RAM to 1GB and storage to 8GB. As the RAM is soldered on-board and cannot be upgraded, buying the Windows version and installing Ubuntu rather than making do with the reduced memory size could be the best option.

There is only a single USB 2.0 port on the ICS, due to the very limited size of the stick. For further expansion an external USB hub is required.

The ICS has an active airflow design: it sucks in air at one end, passes it over a heat-sink and blows it out the other. The fan can barely be heard when idling and, even when the CPU is under a heavier load, can only be heard in a silent room.

Wireless and Bluetooth capabilities are included as standard. Although the wireless works quite well, it may not be enough for streaming large amounts of data, such as 3D graphics.

In use, one end plugs into a display through its HDMI port and power is provided through the micro USB port. The supplied power adapter outputs 2A, which seems to be the standard for newer smartphones and tablets these days.

 

The Intel Remote Keyboard is a new app for Android* and iOS* mobile devices that allows users to wirelessly control their Intel® NUC and Intel® Compute Stick. The app allows the use of a smartphone or tablet as a keyboard and mouse, improving your HTPC or entertainment experience.

 

Features

  • Default Keyboard that includes PC keys such as Windows*, CTRL, ALT, ESC
  • Optional Native Keyboard that enables support for custom keyboard capabilities such as Swype* and dictation
  • Full Windows® 10 gesture support
  • Sensitivity slider for high DPI mobile devices
  • Support for multiple languages: English, Arabic, Chinese (simplified and traditional), French, German, Italian, Japanese, Korean, Polish, Portuguese, Russian, and Spanish.

 

System Requirements

  • Mobile App – Android 4.0* and later, iOS 7* and later
  • Host App – Windows* 8.1 and later; Linux* coming in late 2015

 

Setup

  • Download the Intel® Remote Keyboard mobile app from the Google Play Store* or Apple App Store* for your mobile device.
  • Download and install the Intel Remote Keyboard host app from the Intel® Download Centre on your Intel® Compute Stick or Intel® NUC.
  • Open the app on your mobile device.
  • Select the computer from the device list and scan the QR code on your computer’s screen (first use only).

Protect Your Privacy and Security with these Tools


Across the web, companies are collecting information about you whether you like it or not. It is important to know which companies are more trustworthy with your information than others, and how to keep yourself safe on the internet.

If you are not sure whether the websites you use make the grade, then check out their policies with regard to data sharing.

Surprisingly, when you log in to most mobile or web applications, at least 15 pieces of information about you are sent out. Do not let companies take advantage of the fact that you might be unaware of this, and take action if you catch them claiming one thing but doing another.

You can get justice by reporting the company to the Federal Trade Commission.

If you are a company reading this, and concerned that you are lacking in a good set of privacy and security practices, check out the FTC’s best practices guide – otherwise, watch out.

Free Tools For Privacy Verification

Here are some free tools you can use to keep a company in line with their privacy practices. There are also some tools to prevent third parties (companies who track you) from getting your information as you browse the web.

 

Mitmproxy

Using Mitmproxy, a free “behind the scenes” tool, you can do some investigation to find out whether a company collects more information about you than you would expect or if it sends your information insecurely.

 

PrivacyGrade

PrivacyGrade is a website that allows you to see how robust a company’s mobile application is at protecting your privacy.

AdBlock

The AdBlock browser extension prevents ads from appearing on your browser when you are visiting sites. It also prevents some third parties from receiving some, but not all, information about you.

 

Do Not Track

The Do Not Track browser extension also attempts to minimize the information about you that third parties collect by informing them (many have opt-out policies) that you prefer not to be tracked.
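On the wire, the Do Not Track preference is just an HTTP request header (`DNT: 1`) that the extension attaches to every request. A minimal Python sketch of the same signal (example.com is a placeholder URL):

```python
from urllib.request import Request

# Build a request that advertises the Do Not Track preference; sites and
# third parties with opt-out policies may honour it, but are not obliged to.
req = Request("https://example.com", headers={"DNT": "1"})
print(req.get_header("Dnt"))  # urllib stores headers with normalised capitalisation
```

Whether the header has any effect is entirely up to the receiving site, which is why the article says the extension "attempts to minimize" tracking rather than prevent it.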

 

Cookiepedia

Cookiepedia is an index of thousands of companies that try to collect information about you by placing cookies in your browser. 

If a company lists the third parties that it uses in their Privacy Policy, you can look them up here to learn more about what they do with your data.

 

 

AppAnnie

AppAnnie gives you metrics about mobile applications and has some good aggregate statistics about sharing policies of different application industries.

You can use these to evaluate whether a mobile application you are using shares too much information relative to the average company in that sector.

Free Tools for Security Verification

Here are some free tools you can use to keep a company in line with their security practices, as well as a tool you can use to enhance your own security when a company’s practices fall short.

 

TRUSTe

TRUSTe provides companies with data privacy management solutions, so look for its seal of approval.

 

Verisign Trust Seal

The Verisign Trust Seal, run by Symantec, indicates that a website uses SSL encryption; look for it on the websites you visit.

Infoencrypt, SafeGmail, Hushmail or Lockbin

Use these free services to send your emails encrypted.

 

Dashlane

Use a password manager like Dashlane Password Manager that gives you one master password for all of the passwords you use across the internet.

A manager will set your passwords for other sites to be really long and complicated (i.e. difficult to attack).

All you have to know is your master password and the rest is taken care of for you.
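What a manager generates under the hood is essentially a long random string. A minimal sketch with Python's `secrets` module (the length of 24 is an illustrative choice):

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 24) -> str:
    """A long, random password of the kind a manager stores for you."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = generate_password()
print(len(pw), pw != generate_password())  # 24 True
```

`secrets` uses the operating system's cryptographically secure random source, unlike the general-purpose `random` module, which should not be used for passwords.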

Quantum Dot Displays and BT.2020


This modified iMac display can cover 96% of BT.2020 thanks to quantum dots

When full HD TVs first became available, there was still a scarcity of content to be watched in 1920 × 1080 resolution. Today, history is repeating itself with 4K, high dynamic range (HDR) and quantum dots. The technology is available, but there is a lack of suitable content. Moreover, every manufacturer has been implementing these technologies differently, which has resulted in a lot of confusion. So the new UHD standard announced during the Consumer Electronics Show was well overdue.

The UHD Alliance is a consortium bringing together technology companies and heavyweights from the film industry such as Universal Pictures, Twentieth Century Fox, and Walt Disney Studios. In a press release, they described the major features of the new standard. In order to be certified Ultra HD Premium, a device will need to satisfy minimum specifications in terms of resolution, bit-depth, dynamic range and colour gamut.

 

Encoding in BT.2020

In a way, the certification can be seen as just another attempt to reassure consumers their new device is “future-proof”. For manufacturers, the specifications are loose enough to accommodate everyone. For instance, the standard requires 1000 nits at peak brightness, unless the device is capable of very low black level (in which case 540 nits peak brightness is considered sufficient). It is obvious this decision was made to allow OLED TVs to be certified Ultra HD Premium, since high brightness is still an issue for OLED.

Where the new standard makes significant changes is in content production and distribution. In particular, all content labelled Ultra HD Premium will be encoded with BT.2020 colour representation. This is a major step up from the old BT.709 colour space which is currently in use for most consumer content.

In the last two years, display manufacturers have commercialised what they call 'wide colour gamut' LCD panels. These displays use new phosphors or quantum dots to produce a wider range of colours. However, content encoded in BT.709 cannot fully exploit the capabilities offered by these displays. If poorly calibrated, it can even result in images looking garish or over-saturated.

BT.2020 is a much bigger colour space than anything used before. Therefore, content encoded in BT.2020 will reveal how good the displays truly are in terms of colour reproduction.

 

Quantum Dot Adoption

For quantum dot suppliers, more content in BT.2020 will be a great opportunity to demonstrate the benefits of wide colour gamut displays. However, devices certified Ultra HD Premium will only have to cover the DCI P3 colour gamut, which is much smaller than BT.2020. So the new standard sets the bar very low when it comes to display performance.
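The gap between DCI P3 and BT.2020 can be quantified by comparing the areas of their primary triangles in CIE 1931 xy space — a crude measure that ignores perceptual uniformity, but enough to show how much smaller P3 is:

```python
# CIE 1931 xy chromaticities of the red, green and blue primaries.
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def triangle_area(primaries):
    """Shoelace formula for the area of the gamut triangle."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

coverage = triangle_area(DCI_P3) / triangle_area(BT2020)
print(f"{coverage:.0%}")  # 72% — DCI P3 covers roughly 72% of BT.2020's xy area
```

So a display that only just meets the certification floor leaves more than a quarter of the BT.2020 gamut on the table.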

This enables Samsung to go ahead with their new range of quantum dot TVs. Samsung has decided to use cadmium-free quantum dots, which are regarded as safer than cadmium-based materials. However, colour reproduction with cadmium-free quantum dots currently covers less than 80% of BT.2020.

This is likely to improve in the future as manufacturers learn how to get narrower emission peaks. In the meantime, quantum dots based on cadmium remain the best for displaying BT.2020 content. For example, QD Vision has demonstrated a coverage of 96% of BT.2020 using cadmium selenide quantum dots.

The new standard therefore helps quantum dot adoption in two ways: it lets cadmium-free technology get the certification, while at the same time encouraging consumers to seek the best display on which to watch content in BT.2020.

Although quantum dots are only used in a few TV models at the moment, they will be increasingly important for display manufacturers as the content becomes available.

Market forecast for quantum dot devices and components (source: IDTechEx Research report “Quantum Dots 2016-2026: Applications, Markets, Manufacturers”)

By 2018, the market for quantum dot display components will reach $1.8bn according to forecasts by IDTechEx Research.

It is of course still early to say whether the new standard will be popular. But it represents the best attempt so far at defining how consumers should experience Ultra HD. It is also worth noting that both Netflix and Amazon have contributed to the specifications. With content now being distributed directly over the Internet by these providers, the transition to a new standard might move forward quickly.

VESA announces DisplayPort 1.4 standard with support for 8K displays.



The Video Electronics Standards Association (VESA) has finalised and published the DisplayPort 1.4 standard. The latest version includes a number of new features and specifications that may lead some manufacturers to skip DisplayPort 1.3 entirely.

DisplayPort 1.4 retains the same High Bit Rate 3 (HBR3) physical interface as its predecessor but adds Display Stream Compression (DSC) technology. DSC version 1.2 enables up to a 3:1 compression ratio that VESA testing indicates is visually lossless.

As such, DisplayPort 1.4 can drive 8K displays at 60Hz and 4K displays at up to 120Hz – both with HDR "deep colour." The new standard also supports 32 audio channels, 1,536kHz sample rate and inclusion of "all known" audio formats.
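A rough link-budget check shows why DSC is what makes 8K at 60Hz possible over the unchanged HBR3 interface. This sketch ignores blanking intervals and other protocol overheads, so the numbers are approximate:

```python
# HBR3: 8.1 Gbit/s per lane, four lanes, 8b/10b line coding.
lanes, hbr3_gbps = 4, 8.1
effective = lanes * hbr3_gbps * 8 / 10      # ~25.9 Gbit/s usable

pixels_per_s = 7680 * 4320 * 60             # 8K resolution at 60 Hz
raw = pixels_per_s * 30 / 1e9               # 30 bpp (10-bit RGB): ~59.7 Gbit/s
compressed = raw / 3                        # 3:1 DSC: ~19.9 Gbit/s

# Uncompressed 8K60 exceeds the link; with DSC it fits comfortably.
print(raw > effective, compressed < effective)  # True True
```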

VESA further notes that the 1.4 standard features forward error correction and HDR metadata transport in addition to the expanded audio support. Importantly, the latest standard will work over both DisplayPort and USB Type-C connections.

It will be some time before devices actually implement the new standard. DisplayPort 1.3 was announced in September 2014 but, as multiple publications note, there are still no products on the market that utilise it. Because of this, manufacturers could skip DisplayPort 1.3 entirely and jump to the latest standard.

BBC's Micro:Bit computer

The BBC Micro:Bit will start rolling out to all year-seven pupils in the UK from the end of March.

It consists of a matchbox-sized single-board computer with 256KB of flash and 16KB of RAM, manufactured by element14.

The memory is small because development is done in apps and browsers running on tablets, phones or PCs; the board itself is for running simple programs and interfacing with the world around it.

The Micro:Bit comes with lesson plans, obligatory videos and an array of code samples.

Although this is not a Windows or even a .NET platform, Microsoft has provided two languages for programming, and much of the course material.

 

Why not use the Raspberry Pi?

Many enjoy Linux command line interfaces, but not the average 12-year-old. Micro:Bits are designed to be used by the smartphone generation and the BBC, ARM et al want to reach all children, not just those whose parents force-feed them C++.

 

Micro:Bit ships ready for four development systems.

Entry level is Microsoft Blocks, which offers drag'n'drop flowcharts similar to Scratch. Here you simply assemble programs and fill in the blanks. It is possible to create something which scrolls text or reacts to being picked up without typing any text at all.

The shaking is detected by the accelerometer and messages can be shown on the five-by-five grid of LEDs. This is a basic Internet of Things concept, which teenagers are going to find useful now or later in their careers.

You get event driven programming, loops, simple variables and the ability to make it complain if you pick it up. Most children are motivated by a “gadget” doing something rather than the boring miracle of instant message processing.
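The event-driven pattern described above can be sketched in a few lines of plain Python. The accelerometer here is simulated so the sketch runs anywhere; on the real board the equivalent uses the `microbit` MicroPython module, and the message text and class names below are illustrative only:

```python
# A plain-Python sketch of the event-driven pattern the Micro:Bit
# teaches: react to a "shake" gesture by scrolling a message.
# (The accelerometer is simulated; the real board polls its own.)

class FakeAccelerometer:
    """Stands in for the board's accelerometer, replaying queued events."""
    def __init__(self, events):
        self.events = list(events)

    def was_gesture(self, name):
        # Consume one queued event per poll and compare it to the gesture.
        event = self.events.pop(0) if self.events else None
        return event == name

def run(accelerometer, ticks):
    """Poll the accelerometer each tick, 'scrolling' text on a shake."""
    shown = []
    for _ in range(ticks):
        if accelerometer.was_gesture("shake"):
            shown.append("OUCH!")   # the board would scroll this on its LEDs
    return shown

messages = run(FakeAccelerometer(["shake", "tilt", "shake"]), ticks=5)
print(messages)   # ['OUCH!', 'OUCH!']
```

The point for a beginner is the shape of the loop: poll for events, react, repeat, with no setup code or hardware detail in sight.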

The BBC is pushing the idea of “creativity”, such as making it into a message-displaying badge and for it to respond to text messages through a phone. This can be achieved quite quickly, unlike those BBC Micro days at the dawn of the 1980s when it took three days typing in code from a magazine.

JavaScript made stupidly simple by CodeKingdoms

Microsoft TouchDevelop has been in use for a while as a simple way to get started in mobile app development for Android and Windows Phone. The feedback has been good, and it offers a similarly gentle learning curve.

Code Kingdoms' JavaScript is the next step towards real programming: a drag'n'drop interface helps learners think about algorithms rather than syntax, but it allows stepwise movement from simple-but-limited blocks to straight text-based coding.

These work by pushing the whole program, plus the Lancaster University Runtime, down the line to the Micro:Bit in an IoT rather than PC development style. The runtime includes a Device Abstraction Layer that means languages plug into a jump table and novice programmers don’t need to care much about hardware at all.
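The jump-table idea behind the Device Abstraction Layer can be illustrated with a toy dispatch table: language front-ends call operations by name and never touch the hardware directly. This is a hypothetical sketch of the concept only; the actual Lancaster University runtime is written in C++ on mbed, and the operation names below are made up:

```python
# A toy sketch of a Device Abstraction Layer jump table: callers
# look up an operation by name, and only the table knows the driver.

def led_scroll(text):
    return f"scrolling '{text}' on the 5x5 LED grid"

def read_accelerometer():
    return (0, 0, -1024)   # pretend the board is lying flat

# The "jump table": operation name -> driver function.
DAL = {
    "display.scroll": led_scroll,
    "accelerometer.read": read_accelerometer,
}

def call(op, *args):
    """What a language runtime does: look up the slot, jump to it."""
    return DAL[op](*args)

print(call("display.scroll", "Hello"))
print(call("accelerometer.read"))
```

Swapping hardware then means replacing entries in the table, which is why novice-facing languages can stay oblivious to the board underneath.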

Part of the reason for having two Cortex MPUs is to avoid the problem of a single processor doing the booting, loading and host communication: corruptions can lock the device, which is annoying for an adult developer and catastrophic for children.

So MicroPython can communicate with the host for interpreted debugging and development, which means you can get started on what are rapidly becoming the most common languages for teaching computer science in schools: JavaScript and Python. As the computer is based on ARM's mbed platform, it can also support C++ for more advanced users. The Micro:Bit has an edge connector and supports connections to more devices and sensors via banana plugs and crocodile clips. Companies such as Tech Will Save Us and Kitronik already have them available.

The BBC says it will open-source the whole thing and turn it over to a non-profit outfit that will sell Micro:Bits and develop the platform for the next year.

 

Hardware

The Micro:Bit sports low-power Bluetooth, so it can be paired with a phone or tablet to respond to texts or be programmed via Samsung’s Android app.

The board's Nordic Semiconductor ARM Cortex core handles Bluetooth as well as the actual execution of the self-contained apps. A separate NXP Cortex microcontroller handles USB communication with the host, and NXP also supplies the magnetometer for compass direction and the accelerometer for motion sensing.

The five-by-five LED grid is surface mounted, with power coming from AAA batteries which will keep it going for a few days.

Finally

The Micro:Bit is not in the same space as the Raspberry Pi, that other great small computer board aimed at the next generation. The Pi runs Linux and – at a push – something approximating a version of Windows, which the Micro:Bit does not.

Rather, it aims more at the middle ranks of children who haven’t yet got coding in their blood.

AMD Unveils 64-Bit ARM-Based Opteron A1100 SoC With Integrated 10GbE For The Datacenter

AMD is adding a new family of Opterons to its enterprise processor line-up today called the Opteron A1100 series. Unlike AMD’s previous enterprise offerings, these new additions are ARM-based processor cores, not the X86 cores AMD has been producing for many years. The Opteron A1100 series is designed for a variety of uses, including networking, storage, dense and power-efficient web serving along with 64-bit ARM software development.

The Opteron A1100 System-on-Chip (SoC) was formerly codenamed "Seattle," and it represents the first 64-bit ARM Cortex-A57-based platform from AMD. The AMD Opteron A1100 utilises off-the-shelf ARM Cortex-A57 processor cores with integrated high-speed network and storage connectivity.

amd a1100 overview

The AMD Opteron A1100 Series SoCs will pack up to eight 64-bit ARM Cortex-A57 cores with up to 4MB of shared Level 2 and 8MB of shared Level 3 cache. They offer two 64-bit DDR3/DDR4 memory channels supporting speeds up to 1866 MHz with ECC and capacities up to 128GB, dual integrated 10Gb Ethernet network connections, 8-lanes of PCI-Express Gen 3 connectivity, and 14 SATA III ports.

amd a1100 block diagram

The block diagram above shows the rough arrangement of the processor cores, cache, memory controllers, co-processors, and I/O of the Opteron A1100 series. In addition to all of the connectivity outlined above, the Opteron A1100 also features an ARM TrustZone compliant crypto/compression co-processor, along with a Cortex A5-based system control processor.

Each pair of Cortex-A57s is linked to its own 1MB of L2 cache, hence the "up to" 4MB of shared L2 cache listed in the slide. Though the top-end A1100s feature eight Cortex-A57 cores, quad-core models will also be offered that have four of the cores and their accompanying L2 cache disabled.
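The per-pair figure makes the listed cache totals easy to check; a trivial sketch:

```python
# Sanity-checking the A1100 cache figures: 1MB of L2 per pair of
# Cortex-A57 cores, so the total follows from the active core count.

def l2_mb(cores):
    """Shared L2 in MB, at 1MB per core pair."""
    return cores // 2

print({8: l2_mb(8), 4: l2_mb(4)})   # eight-core: 4MB, quad-core: 2MB
```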

amd a1100 models

In terms of product options, there will be three initial A1100-series Opterons. The top end model -- the A1170 -- features 8 cores, with a max CPU frequency of 2GHz. The A1150 has a similar core configuration, but clocks at a lower 1.7GHz peak, while the A1120 has 4 cores and 2MB of cache and also clocks at 1.7GHz. 

All of the chips have the same memory limitations and operating temperature range, but the top two parts have somewhat higher 32W TDPs due to their higher core counts, versus the quad-core A1120's 25W.

opteron a1100 packaging

The AMD Opteron A1100 series SoCs employ a 27mm x 27mm BGA package with a ball count of 1,021. The die is exposed and does not have an integrated heat spreader like most current desktop processors do.

ddr3 config

ddr4 config

The Opteron A1100 series SoCs work with either DDR3 or DDR4 memory. DDR3 targets lower-cost, and potentially lower-clocked, solutions. DDR3 configurations also support much lower memory densities -- up to 4x lower when RDIMMs are used.

The DDR3 configurations also lack support for Address Parity and require higher voltage, and hence could use more power. To achieve the max 128GB supported by the platform, DDR4 must be used.

amd a1100 partners

AMD has a number of software and hardware partners for the Opteron A1100 series.

SoftIron has a 1U 64-bit ARM developer system already available and Beaconworks will offer a range of network attached storage products.

Silver Lining Systems is offering its PCIe Fabric Interconnect Adapter, the SLS FIA-2100, for AMD Opteron A1100-based servers; 96Boards will also be bringing out a low-cost ARM development platform; and Caswell has an NFV (network-function virtualization) platform in the works.

A handful of software partners are also supporting the AMD Opteron A1100 with operating systems and applications, including Red Hat, SUSE, Enea, and Linaro.

AMD is shipping Opteron A1100 series products to a number of software and hardware partners now, and development systems are already available.

MINIX NGC-1 Intel Braswell-Based

MINIX NGC-1 Intro And Specifications

One of the many benefits of Intel's strong focus on power efficiency in recent years is that relatively high-performance processors are now able to fit into tiny form factors. The recently released Intel Compute Stick for 2016 has a quad-core processor, 2GB of RAM, 32GB of storage, 802.11ac Wi-Fi and a number of external ports in a package not much bigger than a pen drive.

However, the latest Compute Stick has some drawbacks. With just 2GB of RAM, it has the bare minimum needed to run Windows 10 semi-smoothly, and the 32GB of integrated storage cannot be upgraded. The small enclosure also means the CPU inside must be actively cooled, as there is not enough space for a passive heatsink large enough to cool the quad-core Atom CPU.


MINIX has been producing ultra-small form factor systems and accessories for a few years now, featuring both ARM and Intel-based technology. The company's latest device, the NGC-1, is one of its more powerful, featuring an Intel Braswell-based Celeron processor.

Braswell is the follow-up to Intel's Bay Trail SoC and is manufactured on Intel's tri-gate 14nm process. Like Bay Trail, Braswell is a low-power architecture, designed for entry-level 2-in-1 devices, laptops, and small form factor systems.

Along with the Braswell-based Celeron N3150 inside the system, the MINIX NGC-1 also features 4GB of RAM, a 128GB M.2 SSD, and a 64-bit edition of Windows 10. The form factor, while still small, also affords more connectors and I/O than something as small as the Compute Stick.

MINIX NGC-1 Mini PC
Specifications & Features
Processor: Quad-Core Intel Celeron N3150
GPU: Intel HD Graphics
Memory: 4GB DDR3L
Internal Storage: 128GB M.2 SSD
Wireless Connectivity: 802.11ac Dual-Band Wi-Fi (2.4GHz/5GHz), Bluetooth 4.2
Operating System: Windows 10 (64-bit), Ubuntu
Video Output: Mini-DP, HDMI 1.4 (up to 4K @ 30Hz)
Audio Output: Via HDMI 1.4, 3.5mm stereo jack, optical S/PDIF
Peripheral Interfaces: RJ-45 Gigabit Ethernet, 3 x USB 3.0, headphone/mic jack, Kensington Lock
Power: DC 12V, 3A adapter included (CE, CC certified)
 
minix ngc 1 main features
 


The MINIX NGC-1

The MINIX NGC-1 is a very small device at about 12.5cm square by 2.5cm high. Having rounded corners also makes it appear quite a bit smaller. Apart from the various ports, the NGC-1 features a completely sealed, all-metal enclosure that has a high-quality feel. The enclosure has a dark-grey, gunmetal-type finish, with MINIX laser-etched into the top.

 
minix ngc 1 silent mini pc 6
 
Because the quad-core Celeron N3150 and other components inside the NGC-1 are passively cooled, there is no need for vents or holes in the system's chassis. As long as there is some air-space around the system itself (the top cover gets warm to the touch), it runs reliably without overheating.

The Braswell Celeron N3150 CPU has a base clock of 1.6GHz (turbo clock of 2.08GHz) and 2MB of L2 cache, with a TDP of just 6W. Graphics duties are handled by an integrated Intel HD Graphics engine with 12 EUs and base/burst frequencies of 320MHz and 640MHz, respectively. There is also a single channel of 4GB of DDR3L-1600 memory in the NGC-1, along with a 128GB M.2 solid state drive.
 
minix ngc 1 silent mini pc 3

minix ngc 1 silent mini pc 2
 
minix ngc 1 silent mini pc 4
 
On the front of the device, there are three full-sized USB 3.0 ports adjacent to the power button. The right side has a headphone jack, mini-DisplayPort connector, an HDMI port, an S/PDIF optical output, an RJ-45 gigabit LAN jack and the power connector.
 
On the rear are the antenna mounts for the built-in Wi-Fi (Intel AC 3165) and a lock port. On the left is a single tiny LED indicator that lights up blue when the system is powered on and active, and turns green when the machine is asleep.
minix ngc 1 silent mini pc 5
 
The MINIX NGC-1's design language is modern, attractive, and understated. The bottom of the device is relatively clean, apart from the rubber feet and some logos silk-screened in the centre. The only thing that detracts from the overall look of the device is the pair of relatively large external Wi-Fi antennas.

 
minix ngc 1 silent mini pc 1
 
The MINIX NGC-1 comes with a 64-bit edition of Windows 10 Home pre-installed.  Setting the machine up is as simple as connecting power, input devices, and a monitor, then turning it on, entering some basic user information for Windows login purposes, and then letting the OS run a few updates.
 

MINIX NGC-1 Mini PC Performance

It may technically be a quad-core PC, but the MINIX NGC-1 is not meant for high-performance computing applications. It is designed for media consumption and basic computing needs, so rather than run an extensive array of benchmarks, it makes more sense to test the MINIX NGC-1 in a few of the scenarios it was designed for.

 

Processor Arithmetic

Memory Bandwidth

 



In the four SiSoft SANDRA modules run (Processor Arithmetic, Multi-Media, Memory, and File System), the MINIX NGC-1 performed in line with expectations. The Braswell-based Celeron processor powering the device performed about on par with similar Atom-derived products.
 
In the Multi-Media benchmark, performance was similar to Celerons and some older Atom-class processors. Memory bandwidth from the single channel of DDR3 RAM peaked at just under 6GB/s, which is fairly low. The internal 128GB M.2 SSD showed reasonable performance, averaging around 142MB/s but peaking close to 200MB/s.
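That sub-6GB/s figure is plausible given the memory configuration. A single channel of DDR3L-1600 has a theoretical peak of transfer rate times bus width, and real-world benchmarks typically reach only a fraction of that; the 64-bit bus width below is an assumption about the platform, not a figure from the review:

```python
# Theoretical peak bandwidth of one DDR3L-1600 channel vs the
# ~6GB/s the SANDRA memory benchmark reported for the NGC-1.

def peak_gbs(mt_per_s, bus_bits=64):
    """Theoretical channel bandwidth in GB/s (transfers x bytes per transfer)."""
    return mt_per_s * (bus_bits // 8) / 1000

peak = peak_gbs(1600)            # 12.8 GB/s theoretical
measured = 6.0                   # reported by SANDRA
print(f"theoretical peak: {peak:.1f} GB/s")
print(f"measured fraction: {measured / peak:.0%}")   # ~47%
```

Reaching roughly half of theoretical peak is not unusual for a low-power single-channel design, so the result points to the configuration rather than a fault.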

 
suns
 
SunSpider was run in the Microsoft Edge browser to see how the MINIX NGC-1 performed when executing JavaScript, and it returned a score of just over 395ms. That puts the MINIX NGC-1 well ahead of the latest Compute Stick and in line with some of Samsung's recent high-end smartphones.
 
lame

Here are some quick numbers in a simple audio-encoding tool to show where the MINIX NGC-1's Celeron N3150 falls in comparison to some other x86-based low-power processors.
 
In this test, the MINIX NGC-1 outpaces the Compute Stick once again and almost catches the AMD A4 APU.


captain america cpu ut
Captain America: Civil War Trailer, YouTube, 1080P - Scaled To 4K

Streaming SD and HD video from YouTube worked very well. The image above is from a Full HD version of the Captain America: Civil War trailer, set to 1080p but scaled to full screen on a 4K display connected to the NGC-1 via HDMI. It played lag-free using only the built-in Wi-Fi to connect to the web. Regardless of the resolution, this clip played back smoothly, and as can be seen above, CPU utilisation remained relatively low throughout.

 

suicide squad
Suicide Squad Trailer, 4K (2160P), Full Screen

This shot is from the Suicide Squad trailer, streaming from YouTube at 4K (2160P), with the MINIX NGC-1 connected to a 4K display. Unlike the Compute Stick, which struggled to keep this media playing smoothly, the MINIX NGC-1 handled it with no problem. The combination of the NGC-1's slightly faster SoC, additional memory and faster storage allowed it to handle something more taxing like full-screen 4K video much better.

In addition to streaming from the web, an assortment of 1080p MKV, MP4, and AVI files were tested and all played perfectly.
 

MINIX NGC-1 Streaming And Power

The MINIX NGC-1 is also a good candidate for thin-client applications, or for remote controlling other systems.

teamview
TeamViewer 11 Running On The MINIX NGC-1 @ 4K

The MINIX NGC-1 worked perfectly using Windows' built-in Remote Desktop tool and other remote support tools like TeamViewer. Accessing a higher-end system remotely from the MINIX NGC-1 should not be a problem at all.  In this scenario, the MINIX NGC-1 is connected to a 4K display and remotely connected to another PC, which is connected to a much lower resolution display -- hence the full resolution / desktop in a relatively small window.

 

minix power

Here are some power consumption figures from a variety of usage scenarios. The MINIX NGC-1 uses between 6 and 11 watts of power.
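To put that 6-11W draw in context, here is the annual energy use if the NGC-1 ran around the clock, as a thin client might; the electricity price used is an assumption for illustration:

```python
# Annual energy use of the NGC-1 at its measured 6-11W draw,
# assuming 24/7 operation (price per kWh is an assumed figure).

def annual_kwh(watts, hours=24 * 365):
    """Energy consumed in kWh over a year of continuous operation."""
    return watts * hours / 1000

low, high = annual_kwh(6), annual_kwh(11)
print(f"{low:.0f}-{high:.0f} kWh/year")          # ~53-96 kWh
print(f"~${high * 0.15:.2f}/yr at $0.15/kWh")    # worst case
```

Even at full tilt, a year of continuous use costs less than many desktop PCs consume in a couple of months, which reinforces the thin-client case.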
 

MINIX NGC-1 Summary And Conclusion

 
Performance Summary: The MINIX NGC-1 is a capable, silent, ultra-small form factor PC that is well suited to media and thin-client applications.
 
Running a full 64-bit edition of Windows, there is software available to play back virtually any file type, and the tests showed that the system was perfectly capable of local playback and streaming of all types of media up to 4K resolution.

minix ngc 1 silent mini pc 7

The system is well designed and appointed, offering a very good mix of hardware for its intended applications. The 4GB of RAM and 128GB SSD should give users plenty of memory and storage headroom for different types of applications.