In a previous post, we discussed stacks and queues as data structures, covering their major operations along with their time and space complexity. To implement them, we used Python lists (arrays). If you are new to stacks and queues, here are links to those articles: Link1 Link2
Arrays suit scenarios where elements occupy a contiguous block of memory. But when memory can't be allocated contiguously, or when elements are frequently inserted into or removed from the middle of the list, arrays force the remaining elements to be shifted, which increases the cost of those operations.
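A singly linked list, which these posts build toward, avoids that shuffling: each node stores only a reference to the next node, so a mid-list insertion is a couple of pointer updates rather than a shift of every later element. The `Node` and `insert_after` names below are illustrative, not from the original post:

```python
class Node:
    """A node holds a value and a reference to the next node."""
    def __init__(self, value):
        self.value = value
        self.next = None

def insert_after(node, value):
    """Insert a new value right after `node` in O(1),
    without shifting any other elements."""
    new_node = Node(value)
    new_node.next = node.next
    node.next = new_node
    return new_node

# Build the list 1 -> 3, then insert 2 in the middle
head = Node(1)
insert_after(head, 3)
insert_after(head, 2)

# Traverse from the head to collect the values in order
values = []
current = head
while current:
    values.append(current.value)
    current = current.next
print(values)  # [1, 2, 3]
```

Compare this with `list.insert(1, 2)` on a Python list, which must move every element after index 1 one slot to the right.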
In my last post, I discussed how you could easily create an end-to-end recommendation engine using Amazon SageMaker.
Today, we will leverage Google Cloud Platform and Apache Spark to create a recommendation engine that you can easily integrate into a data engineering pipeline. You can use the same mechanism and deploy it on AI Platform notebooks.
Before we dive deep, let’s discuss a few terms.
It’s a collaborative filtering technique where…
I hope you are having fun learning Python and getting acquainted with data structures along the way. Soon we will start designing algorithms, but let's build our fundamentals first.
In the last session, I discussed why it is important to understand data structures and algorithms, what the basic data structures are, and how we can use those basic data structures to build complex ones (e.g., a stack). We also discussed time and space complexity. If you haven't read that post and are new to data structures, feel free to check it out here.
Today we will develop data structures that you have already implemented in…
For folks who have started learning to program, or who are willing to learn and pursue a career in fields that require programming knowledge, this post will help you understand the fundamentals of data structures and algorithms.
Data structures and algorithms are among the most fundamental aspects of programming, taught in any computer science course. You can learn any programming language, but to be a good programmer, you have to develop a grip on data structures and algorithms. So what is a data structure, what is an algorithm, and why should we learn them?
A data structure, as the name suggests, is the…
I hope you are doing well. This post comes late, as I have been struggling with my health. In the meantime, I have managed to upgrade my skills by learning new technologies and platforms, and I am eager to share that knowledge with you all. Without wasting any more time, let's learn something new today.
Cloud computing has been around for a while, but the pace of adoption has accelerated sharply in recent times, thanks to the pandemic and to IT companies realizing its benefits and trade-offs. There are three major cloud service providers: AWS, Microsoft Azure, and GCP.
Google's cloud computing services have been around for a…
How many times have you visited a shopping website and purchased something? Have you noticed that the website personalizes itself based on your purchase history?
You have probably watched YouTube videos. Have you noticed that the app starts suggesting videos based on the type of content you watch?
What about when you watch a movie or browse the internet? Do the ads look familiar?
Recommendation engines are among the most important applications of machine learning; they have changed how businesses interact with their customers. …
The term "chatbot" was coined from the original "ChatterBot", created back in 1994; the term itself suggests a machine that can carry on human conversations. Although it's interesting to learn about chatbots, the real power lies in solving actual business use cases, where you can automate much of the manual labor currently done by human beings.
Imagine interacting with a smart speaker today to book a flight or order food; nobody would have imagined doing these things with the same convenience a decade ago. Imagine customer support today, with automated chatbots handling customers through automated replies. How much…
The objective of this post is to guide you through building an end-to-end machine learning pipeline involving deep learning and object detection, using the ResNet-50 architecture on AWS's cloud computing service SageMaker. We will also discover how we can use Amazon SageMaker Ground Truth to label large datasets within minutes.
The post will cover all the major aspects of the machine learning development lifecycle:
Before we even start downloading our dataset, we will spin up a SageMaker notebook instance: we can do so
I hope you all are doing well. In this article, I would like to share my experience of passing the AWS Machine Learning Specialty certification exam. The objective of this post is to help folks who would like to pursue a career in data science or machine learning and want to showcase their interests. At the end of this post, I will highlight a couple of points about this certification exam, which you can use as hints about whether or not you should go for the certification.
Building and managing a data science or machine learning pipeline requires working with different tools and technologies, from the data collection phase all the way to model deployment and monitoring. It involves knowledge of software engineering, data science, data engineering, data analysis, DevOps, and cloud computing. If you have skilled people for each of these, then congratulations: you have saved yourself some hassle.
But I personally enjoy working on and building end-to-end flows, and managing the whole process becomes cumbersome most of the time.