Label Encoder | Different Types of Feature Engineering Encoding Techniques | sklearn | TeKnowledGeek
In this video, I will show you how to use the label encoding method to convert categorical data to numerical data.
1. Why do we need to convert categorical data to numerical data?
2. How can we encode categorical data with high cardinality?
3. What are the pros and cons of label encoding?
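The basic label encoding workflow covered in the video can be sketched like this (a minimal sketch using sklearn's `LabelEncoder`; the `colors` data is an illustrative example, not taken from the video):

```python
from sklearn.preprocessing import LabelEncoder

# A small categorical feature to encode
colors = ["red", "green", "blue", "green", "red"]

encoder = LabelEncoder()
encoded = encoder.fit_transform(colors)

# Classes are assigned integers in sorted order: blue=0, green=1, red=2
print(encoded)           # [2 1 0 1 2]
print(encoder.classes_)  # ['blue' 'green' 'red']

# The mapping is reversible
print(encoder.inverse_transform([0, 2]))  # ['blue' 'red']
```

Note the main caveat discussed under pros and cons: the integers imply an ordering (blue < green < red) that the original categories may not have, which can mislead some models.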
Our machine learning playlist: https://www.youtube.com/playlist?list=PL7e7n_-0r6CG3C4e2UNCJyyDNjxYoOC49
Channel link: https://www.youtube.com/c/TeKnowledGeeK
LinkedIn profile: https://www.linkedin.com/in/akshay-mukhopadhyay-b3585ba2
Related video title:
Iterative Imputer | how to handle missing data machine learning | TeKnowledGeek
Related tags:
python
analytics
machine learning
imputer
dropna
fillna
features
target
dependent variables
independent variables
sklearn
imputer sklearn example
sklearn imputer
machine learning preprocessing
sklearn.preprocessing
imputer find the missing values in python
imputer.fit
fit_transform
imputer on missing values python
python imputer example
imputer sklearn
imputer python
how to use imputer
imputer function in python
normalization in feature scaling
code with akshay
how to do feature scaling in machine learning
Reference:
https://towardsdatascience.com/categorical-encoding-using-label-encoding-and-one-hot-encoder-911ef77fb5bd
Thumbnail photo by: https://unsplash.com/photos/YKW0JjP7rlU?utm_source=unsplash&utm_medium=referral&utm_content=creditShareLink
Tags: #LabelEncoding #FeatureScaling #FeatureEngineering #Data_Preprocessing #machine_learning #teknowledgeek