Another day, another photography cheat sheet. There’s a lot of them out there, especially for those who want to break away from the auto mode and take full control of their camera settings.
The AAPicks team writes about things we think you’ll like, and we may see a share of revenue from any purchases made through affiliate links. When it comes to online videos, YouTube reigns supreme. It’s grown from being a place to watch funny videos to a marketing powerhouse.
This book is intended for busy professionals working with data of any kind: engineers, BI analysts, statisticians, operations research, AI and machine learning professionals, economists, data scientists, biologists, and quants, ranging from beginners to executives.
The goal of automation has always been efficiency. What if artificial intelligence sees humanity itself as the thing to be optimized?
Understanding electrons' intricate behavior has led to discoveries that transformed society, such as the revolution in computing made possible by the invention of the transistor.
Algorithms are language agnostic and any programmer worth their salt should be able to convert them to code in their programming language of choice.
Consider this hypothetical scenario: Bob and Alice are playing a game of Magic: The Gathering. It's normal game play at first, as, say, Filigree robots from Kaladesh face off against werewolves and vampires from Innistrad.
The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial markets get a lot of attention. They tend to fluctuate unpredictably and sometimes wildly.
New VM image — updated March 2018! I love to write about face recognition, image recognition and all the other cool things you can build with machine learning.
It’s 3 AM on a warm Thursday night in December. A usually quiet street in the Gothic Quarter of Barcelona is bustling with activity as a cohort of 200 artificial intelligence researchers files out of a sprawling yellow mansion in single file.
In a few seconds, I want you to stop reading this article, and follow the instructions below. Machine learning and artificial intelligence (ML and AI) have seized Tech mindshare in a way few topics have in recent memory.
At Airbnb, we are always searching for ways to improve our data science workflow. A fair amount of our data science projects involve machine learning, and many parts of this workflow are repetitive. These repetitive tasks include, but are not limited to:
As buzzwords become ubiquitous they become easier to tune out. We’ve finely honed this defense mechanism, for good purpose. It’s better to focus on what’s in front of us than the flavor of the week. CRISPR might change our lives, but knowing how it works doesn’t help you.
If you’re a programmer or techie, chances are you’ve at least heard of Docker: a helpful tool for packing, shipping, and running applications within “containers.” It’d be hard not to, with all the attention it’s getting these days — from developers and system admins alike.
Update: This article is part of a series. Check out the full series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7 and Part 8! You can also read this article in 普通话, Русский, 한국어, Português, Tiếng Việt or Italiano.
Over a year ago, following an original presentation at MLConf, I wrote a blog post entitled “10 Lessons Learned from building ML systems”. At that point, I was leading the Algorithms Engineering team at Netflix and those lessons reflected lessons we had learned there over the last few years.
Disclaimer: I’m not an expert in neural networks or machine learning. Since originally writing this article, many people with far more expertise in these fields than myself have indicated that, while impressive, what Google have achieved is evolutionary, not revolutionary.
A year and a half ago, I dropped out of one of the best computer science programs in Canada. I started creating my own data science master’s program using online resources. I realized that I could learn everything I needed through edX, Coursera, and Udacity instead.
When we created Snips a few years ago, we did so because we believed in using Artificial Intelligence to solve everyday problems. From predicting passenger flow in public transport to anticipating car accidents, we always tried to find a way to bring the power of machine learning to consumers.
If there is one technology that promises to change the world more than any other over the next several decades, it is arguably machine learning.
Distilling a generally-accepted definition of what qualifies as artificial intelligence (AI) has become a revived topic of debate in recent times. Some have rebranded AI as “cognitive computing” or “machine intelligence”, while others incorrectly interchange AI with “machine learning”.
As a machine learning acolyte, I spent probably as much time trying to understand things like how and when to use machine learning as I did understanding the technical details of machine learning itself. Unfortunately, most of the discussion around machine learning is about the former.
I’ve worked with deploy systems in the past that have a prominent “rollback” button, or a console incantation with the same effect. The presence of one of these is reassuring, in that you can imagine that if something goes wrong you can quickly get back to safety by undoing your last change.
Estimated reading time: 12 minutes. I’m an expert on how technology hijacks our psychological vulnerabilities. That’s why I spent the last three years as a Design Ethicist at Google caring about how to design things in a way that defends a billion people’s minds from getting hijacked.
Originally published at innoarchitech.com here on March 18, 2016. Welcome to the fifth and final chapter in a five-part series about machine learning.
Anticipatory Design is possibly the next big leap within the field of Experience Design. “Design that is one step ahead” as Shapiro refers to it. This sounds amazing, but where does it lead us? And how will it affect our relationship with technology?
There is no doubt that the sub-field of machine learning / artificial intelligence has gained popularity in the past couple of years.
It’s New Year’s 2017, so time to make predictions. Portfolio diversification has never been me, so I’ll make just one. Generative Adversarial Networks — GANs for short — will be the next big thing in deep learning, and GANs will change the way we look at the world.
How they’re different and why they’re all essential to the Internet of Things. #askIoT We’re all familiar with the term “Artificial Intelligence.” After all, it’s been a popular focus in movies such as The Terminator, The Matrix, and Ex Machina (a personal favorite of mine).
I’ve seen a few CS students fearful about the industry they’ll enter when they graduate. And with all the recent tech news, who can blame them? Why am I even still here? This is my career retrospective — what has been great, what has been horrible, why I’m still here & fighting.
A few weeks ago, I wrote about how and why I was learning Machine Learning, mainly through Andrew Ng’s Coursera course. Now I’m checking back in with 9 weeks under my belt. Machine Learning is built on prerequisites, so much so that learning by first principles seems overwhelming.
After millions of years of evolutionary trial and error (natural selection, as Charles Darwin put it), Homo sapiens proved to be the dominant species. Was this because humans were expert risk takers or fear conquerors? Quite the opposite, actually.
How can you set up your own Convolutional Neural Network? Let's try to answer that in this article. We will be working on an image segmentation problem, which I discussed in the first part of this series. There are a lot of libraries available for creating a Convolutional Neural Network.
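Before reaching for any of those libraries, it helps to see what a convolutional layer actually computes. Here is a minimal sketch of a single 2D "valid" convolution in plain Python; the 5×5 image and 3×3 kernel are made-up values for illustration, not from the article:

```python
# Minimal 2D "valid" convolution -- the core operation inside a CNN layer.
# Image and kernel values below are illustrative toy data.

def conv2d(image, kernel):
    """Slide a k x k kernel over the image (no padding, stride 1)."""
    k = len(kernel)
    out_h = len(image) - k + 1
    out_w = len(image[0]) - k + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Element-wise multiply the patch by the kernel and sum.
            s = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(k)
                for dj in range(k)
            )
            row.append(s)
        out.append(row)
    return out

image = [[1] * 5 for _ in range(5)]   # 5x5 image of ones
kernel = [[1] * 3 for _ in range(3)]  # 3x3 kernel of ones
result = conv2d(image, kernel)        # 3x3 output, every entry equals 9
```

A real CNN layer adds learned kernel weights, many channels, and a non-linearity on top, but the sliding window above is the essential idea.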
An average data scientist deals with loads of data daily. Some say 60–70% of their time is spent cleaning and munging data, and bringing it into a format that machine learning models can be applied to. This post focuses on the second part, i.e.
For this tutorial in my Reinforcement Learning series, we are going to explore a family of RL algorithms called Q-learning algorithms. These are a little different from the policy-based algorithms covered in the following tutorials (Parts 1–3).
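As a rough illustration of what tabular Q-learning does, here is a sketch on a made-up five-state corridor environment; the environment, reward, and hyperparameters are assumptions for the example, not taken from the tutorial:

```python
import random

random.seed(0)

# Made-up corridor: states 0..4, reward 1.0 for reaching state 4.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # left, right

def step(state, action):
    nxt = min(max(state + action, 0), GOAL)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action]
alpha, gamma, eps = 0.5, 0.9, 0.1

for _ in range(500):  # episodes
    s, done = 0, False
    while not done:
        if random.random() < eps or Q[s][0] == Q[s][1]:
            a = random.randrange(2)  # explore (and break ties randomly)
        else:
            a = 0 if Q[s][0] > Q[s][1] else 1  # exploit current estimate
        nxt, r, done = step(s, ACTIONS[a])
        # Q-learning update: bootstrap from the best next-state value.
        Q[s][a] += alpha * (r + gamma * max(Q[nxt]) - Q[s][a])
        s = nxt
```

After training, the greedy policy at every non-terminal state prefers moving right toward the goal, which is the value-based flavor of RL that distinguishes Q-learning from policy-based methods.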
The design for The Growroom, an urban farm pavilion that explores how cities can feed themselves through food-producing architecture, is now open source and available for anyone to use. SPACE10 envisions a future where we grow our own food much more locally.
Developers often say that if you want to get started with machine learning, you should first learn how the algorithms work. But my experience shows otherwise. I say you should first be able to see the big picture: how the applications work.
You may have read at NYMag that I’ve been in discussions with the Clinton campaign about whether it might wish to seek recounts in critical states.
Kindness: If you are giving back you’ve already taken too much. Evolve and grow: Life’s about progress, we can either move forward and relentlessly improve or be consumed and surpassed by the horde which stands in wait behind us. Standing still is proportionate to regression.
Nick Pinkston grew up in rural Pennsylvania in a family that has spent more than three generations in the manufacturing industry. They worked in coal mining in West Virginia and then in ceramics in New Castle, Pennsylvania, where he grew up.
Every day brings new headlines for how deep learning is changing the world around us. A few examples:
In part one of this blog post, we detailed the different components of Netflix personalization. We also explained how Netflix personalization, and the service as a whole, have changed from the time we announced the Netflix Prize.
There has been a recent surge in the popularity of Deep Learning, which has achieved state-of-the-art performance on tasks like language translation, strategy games, and self-driving cars, all of which require millions of data points.
“Artificial Intelligence”: this term has become so popular/hyped/*add an adjective of your choice* this decade that we’re talking about it more than ever. So much so that anything about AI becomes front-page news. Tech media certainly seems to have a crush on AI.
Many technology companies now have teams of smart data scientists, versed in big-data infrastructure tools and machine learning algorithms. But every now and then, a data set with very few data points turns up, and none of these algorithms seems to work properly anymore.
Google’s rollout of artificial intelligence has many in the search engine optimization (SEO) industry dumbfounded. Optimization tactics that have worked for years are quickly becoming obsolete or changing.
Whether it be reading scripts in Hollywood, deciding which stories to cover for TechCrunch and VentureBeat or what to invest in for GV, vetting the worthy vs. unworthy has been the common thread through it all for MG Siegler.
Between January and February 2017, we’ve ranked nearly 2,000 Machine Learning articles to pick the Top 10 stories (0.5% chance) that can help advance your career.
It’s 2 a.m. and half of our reliability team is online searching for the root cause of why Netflix streaming isn’t working. None of our systems are obviously broken, but something is amiss and we’re not seeing it.
In the past year, I’ve become convinced that machine learning is not hype. Strong AI/AGI is no longer a requirement for complex tasks. It doesn’t matter that AGI is out of reach, since we don’t need it in order for automation to take over vast swathes of the job market.
If you read the first article in this series, you’re already on your way to upping your math game. Maybe some of those funny little symbols are starting to make sense. Also be sure to check out parts 3, 4, 5, 6 and 7.
The creative reach of the individual is expanding. The assortment of available tools, platforms and devices for design is growing while their costs are diminishing. You can make a film, record an album, design a city or print your own flower pot.
I spend roughly 73% of my life thinking about web performance — hitting that sweet 60FPS on slow phones, loading my assets in the perfect order, offline-caching everything I can. But recently I’ve been wondering if my definition of web performance is too narrow.
Machine learning is going to change the world more than any other technology over the next several decades. To take advantage of the machine learning revolution, we (aka product managers) should move quickly to equip ourselves with the necessary tools.
A few months ago, my friend Tim took a new sales job at a Series C tech company that had raised over $60 million from A-list investors. He’s one of the best salespeople I know, but soon after starting, he emailed me to say he was struggling.
What are some of the applications and use cases? #askIoT Machine Learning (ML) and the Internet of Things (IoT) are huge buzzwords right now, and they’re both near the peak of the hype cycle. The above quote came somewhat jokingly from an investor, but it has some truth to it too.
My least favorite moment in all of cinema is a relatively common one. You will recognize it, I’m sure, from dozens of movies and TV shows that prominently feature scientists. You may even have laughed at it once or twice. It usually gets a quick chortle. The moment goes something like this:
Take a look at the image below. It’s a collection of bugs and creepy-crawlies of different shapes and sizes. Take a moment to categorize them by similarity into a number of groups. This isn’t a trick question. Start with grouping the spiders together.
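That informal grouping exercise is exactly what a clustering algorithm does numerically. As a rough sketch of one such algorithm, k-means, here is a toy version on invented 1-D data (the points and k=2 are assumptions for illustration):

```python
# Tiny k-means sketch on 1-D points (toy data, two clusters).

def kmeans_1d(points, k=2, iters=10):
    centers = points[:k]  # naive init: first k points as centers
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assignment step: put each point with its nearest center.
            idx = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Update step: move each center to the mean of its points.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
centers, clusters = kmeans_1d(points)
# The two centers settle near the two obvious groups of points.
```

The "group the spiders together" instinct is the assignment step; noticing what the groups have in common is the update step.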
As we have described previously on this blog, at Netflix we are constantly innovating by looking for better ways to find the best movies and TV shows for our members. When a new algorithmic technique such as Deep Learning shows promising results in other domains (e.g.
You’re a startup founder. You know you need to have a “data play” (or worse, “AI play”). Investors and clients are asking about machine learning (or worse, deep learning). The question is no longer why, but when. So, you hire your first data scientist.
We cover many emerging markets in the startup ecosystem. Previously, we published posts that summarized Financial Technology, Internet of Things, Bitcoin, and MarTech in six visuals. This week, we do the same with Artificial Intelligence (AI).
It’s early summer, and I’m in Dupont Circle. Something’s off. People, I notice, seem to be suddenly tweeting much less lately. But I’ve got a book to finish, so I file the observation away to carefully inspect later. It’s late summer, and I’m standing in Madison Square, frowning.
Oliver Tan is the Co-Founder and CEO of ViSenze, an artificial intelligence company. The idea of machine learning sounds like a science fiction thriller or action movie where a computer takes over the world. It rarely goes well for the humans (remember I, Robot?).
On average, you sleep 7 hours and 50 minutes per night. Considering that life expectancy for countries in the Western world is about 80 years—you’ll spend 26.6 years of your life asleep. That’s almost 1/3 of your time on this planet. And yet, we feel tired so often.
Why is the world’s most advanced AI used for cat videos, but not to help us live longer and healthier lives? A brief history of AI in Medicine, and the factors that may help it succeed where it has failed before.
It seems like AI, data science, machine learning and bots are some of the most discussed topics in tech today. Given my company Fuzzy.
Scikit-learn is an open source Python library that implements a range of machine learning, preprocessing, cross-validation and visualization algorithms using a unified interface. Your data needs to be numeric and stored as NumPy arrays or SciPy sparse matrices.
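That unified interface boils down to `fit`/`transform` for preprocessors and `fit`/`predict` for models. A minimal sketch (the toy data is made up, and any scikit-learn classifier could be swapped in for `LogisticRegression`):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Toy numeric data stored as NumPy arrays, as scikit-learn expects.
X = np.array([[0.0], [1.0], [2.0], [10.0], [11.0], [12.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# Every preprocessor exposes fit/transform...
scaler = StandardScaler().fit(X)
X_scaled = scaler.transform(X)

# ...and every model exposes fit/predict, whatever the algorithm.
clf = LogisticRegression().fit(X_scaled, y)
preds = clf.predict(scaler.transform(np.array([[1.5], [11.5]])))
```

Because the interface is uniform, swapping in a different algorithm usually means changing only the constructor line, not the surrounding code.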
Update: This article is part of a series. Check out the full series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7 and Part 8! You can also read this article in 普通话, Русский, 한국어, Tiếng Việt or Italiano.
Since the 1860s, when they first appeared in the lobbies of plush hotels, lifts have changed the world. Before the lift, buildings were generally no higher than about seven floors. After the lift, we had skyscrapers.
I’ve just finished Week 5 of the Coursera/Stanford Machine Learning course. It has been a mixture of refreshing, relearning, and new for me. I had already been using, building, and researching/evaluating machine learning algorithms for a number of years.
Containers are already adding value to our proven globally available cloud platform based on Amazon EC2 virtual machines. We’ve shared pieces of Netflix’s container story in the past (video, slides), but this blog post will discuss containers at Netflix in depth.
A list of all named GANs! Every week, new papers on Generative Adversarial Networks (GANs) are coming out, and it’s hard to keep track of them all, not to mention the incredibly creative ways in which researchers are naming these GANs! You can read more about GANs in this Generative Models post by O
If you had asked 22-year-old me what my “career aspirations” were, I would have looked at you blankly and then casually changed the subject to what programs you’d recommend to model cute 3D bunnies for a video game, or whether the writers of Alias would be so devious as to ship Sydney Bristow
Data science and machine learning have long been interests of mine, but now that I’m working on Fuzzy.ai and trying to make AI and machine learning accessible to all developers, I need to keep on top of all the news in both fields. My preferred way to do this is through listening to podcasts.
Little things become big things. When you justify and allow even little things into your life which your intuition warns you against, you permit a virus to enter your life. It spreads to other areas.
Everything has to be a billion-dollar idea that changes the world, or it isn’t worth doing. The reality is that, for all the thoughtfully articulated and executed world-domination master plans out there, most of the biggest and most impactful companies started out with much humbler ambitions.
In our previous posts about Netflix personalization, we highlighted the importance of using both data and algorithms to create the best possible experience for Netflix members. We also talked about the importance of enriching the interaction and engaging the user with the recommendation system.
Why is it important? How long does it take to build an app? What if it’s for two platforms at once? What if you have to do it all by yourself? Moreover, what if you take up the challenge of building such an app using technologies you have never worked with before?
Sound familiar? Looking back, I realize I used my work to try and fill a void in myself. The problem was that this void was like a black hole. No matter how many hours I worked, it never seemed to fill it up. If anything, it made me feel worse.