In my last blog post, I detailed the implementation of machine learning models in iOS applications using the Core ML and Vision frameworks. As you may remember from the tutorial, I implemented the Inception v3 model to give the app the ability to classify 1,000 common objects. While it is true that you can easily download the model from a GitHub repository, have you ever wondered where it came from? In this blog post, I will introduce the “brain” behind the Inception v3 model: an artificial neural network (ANN).
Recently, I have been experimenting with CoreML, the machine learning framework for Apple’s mobile and desktop operating systems. Rather than continue my discussion of linear regression, I will detail the implementation of a model with CoreML in this blog post.
You might remember linear regression from statistics as a method for producing a linear equation that models the relationship between two variables. Not surprisingly, linear regression is quite similar in machine learning, except that the focus is on prediction rather than the interpretation of data. As you may remember from my previous blog, regression is a supervised learning algorithm that predicts a real-valued output for a given input. In this blog post, I will discuss the model representation of simple linear regression and introduce its cost function.
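To make the model representation concrete, here is a minimal sketch of simple linear regression, assuming the standard hypothesis h(x) = θ₀ + θ₁x and the usual mean squared error cost function. The function names and the toy data are my own illustration, not from any particular post:

```python
def hypothesis(theta0, theta1, x):
    """Predicted value for input x under the current parameters."""
    return theta0 + theta1 * x

def cost(theta0, theta1, xs, ys):
    """Mean squared error cost: J = (1 / 2m) * sum((h(x_i) - y_i)^2)."""
    m = len(xs)
    return sum((hypothesis(theta0, theta1, x) - y) ** 2
               for x, y in zip(xs, ys)) / (2 * m)

# Toy data lying exactly on y = 2x + 1, so the cost at the true
# parameters (theta0 = 1, theta1 = 2) is zero.
xs = [1, 2, 3]
ys = [3, 5, 7]
print(cost(1, 2, xs, ys))  # 0.0
```

Minimizing this cost over θ₀ and θ₁ (for example, with gradient descent) is what "fitting" the line means.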
As we discussed in the previous post, machine learning is one of the main branches of artificial intelligence, in which we aim to build a rational agent. Machine learning is essential to the implementation of artificial intelligence, for it allows agents to adapt to different scenarios, as well as predict changes in the evolving environment around them. Continue reading
I hope you aren’t counting on another computer science blog this week, because this won’t be one. I’m here to talk about totally unrelated things and AMC’s Halt and Catch Fire. I promise you, at least I hope, it will make sense.
I am a horrendous cook: despite classes, Anthony Bourdain’s entire body of work, and repeated attempts at cooking family dinner, I am still banned from cooking at home. However, if Julia Child taught me anything, it’s that following a recipe can get you an (almost) edible meal. In computer science, we talk about algorithms as our recipe cards: sets of instructions (often equations) that take a bunch of ingredients and give us our meal.
There were a lot of things we couldn’t do with our hundred-pound hunk of aluminum, bolts, and old scrapped car motors; however, losing didn’t seem to be one of them. They say you can’t ‘fake it ’til you make it’; however, after making it to the World Championships of FRC Robotics with a dysfunctional heap of poor planning and design, I began to wonder if you really could. Our robot lurched across the field as if it were on its deathbed every match, looked vaguely like a pile of scrap, and had an affinity for unintentional right turns. However, no matter how bad we thought we were, here we were at the World Championships, and that meant hours upon hours of individual match planning and strategy.
Not with a bang but a whimper. CS50 is now over, and I am thankful I was there for its final moments. Chance led to my being in Boston on the date of the final lecture, and with a short Uber ride twelve miles north, I found myself in the halls of Sanders Theatre.
Welcome to Week Eight. This will be CS50’s last week as a traditional course; this Friday I will turn in my last problem set, and my assigned coursework will be over. What will follow is four weeks of free time in which I will have to complete my final project. Instead of walking through the last legs of C$50 Finance, whose tedium reached nearly comical levels, I figure our time will be better spent looking ahead at the final project.
This week has ended with little progress, but some important decisions were made. A pivotal time lies ahead, as my own future and that of the Grazer will soon be decided.