Data Science (Incl AI/ML)

Tuesday, October 29, 2019

General ML Notes






Resources

Model Building
  1. Dummy Variables
  2. Why it's necessary to create dummy variables 
  3. When to normalise data and when to standardise? (see the sketch after this list)
  4. Scaling techniques
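
A minimal sketch of the above, assuming pandas and scikit-learn (neither library is named in the notes): pd.get_dummies for dummy variables, StandardScaler for standardising (zero mean, unit variance) and MinMaxScaler for normalising to [0, 1]. The column names and values below are made up purely for illustration.

    # Illustrative sketch (not from the original post): dummy variables and scaling.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler, MinMaxScaler

    df = pd.DataFrame({
        "city": ["Delhi", "Mumbai", "Delhi", "Chennai"],   # categorical feature (hypothetical)
        "income": [40_000, 85_000, 52_000, 61_000],        # numeric feature (hypothetical)
    })

    # Dummy variables: one 0/1 column per category, dropping the first level
    # to avoid the dummy-variable trap (perfect multicollinearity).
    dummies = pd.get_dummies(df["city"], prefix="city", drop_first=True)

    # Standardise (zero mean, unit variance) -- the usual choice when the model
    # assumes roughly Gaussian inputs (linear/logistic regression, PCA).
    standardised = StandardScaler().fit_transform(df[["income"]])

    # Normalise to [0, 1] (min-max) -- the usual choice when a bounded range
    # matters (distance-based methods, neural-network inputs).
    normalised = MinMaxScaler().fit_transform(df[["income"]])

    print(dummies.join(pd.DataFrame({"income_std": standardised.ravel(),
                                     "income_minmax": normalised.ravel()})))
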
Model Comparison (complexity-penalised alternatives to plain R-squared; see the sketch after this list)
  1. AIC (Akaike Information Criterion)
  2. BIC (Bayesian Information Criterion)
  3. Mallows' Cp
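
A hedged sketch (not from the post) of how these criteria are used in practice, assuming statsmodels: fitted OLS results expose .aic and .bic directly, and Mallows' Cp for a submodel can be computed as SSE_p / sigma^2 - n + 2p, with sigma^2 estimated from the full model's residual mean square. The simulated data and variable names are illustrative.

    # Illustrative sketch: comparing nested OLS models with AIC, BIC and Mallows' Cp.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    X_full = rng.normal(size=(n, 3))
    y = 2.0 * X_full[:, 0] - 1.0 * X_full[:, 1] + rng.normal(size=n)  # 3rd column is pure noise

    def fit(cols):
        X = sm.add_constant(X_full[:, cols])
        return sm.OLS(y, X).fit()

    full = fit([0, 1, 2])
    sub = fit([0, 1])

    # AIC = 2k - 2 ln(L), BIC = k ln(n) - 2 ln(L); lower is better for both.
    print("AIC:", sub.aic, full.aic)
    print("BIC:", sub.bic, full.bic)

    # Mallows' Cp for the submodel, using the full model's MSE as the sigma^2
    # estimate; a good submodel has Cp close to its parameter count p.
    p = sub.df_model + 1          # parameters including the intercept
    sigma2 = full.mse_resid       # residual mean square of the full model
    cp = sub.ssr / sigma2 - n + 2 * p
    print("Mallows' Cp:", cp, "vs p =", p)
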
Feature Selection
  1. Univariate methods
  2. Linear models and regularisation
  3. Random forests
  4. Stability selection, RFE and the above compared (see the sketch after this list)
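
A short scikit-learn sketch (an assumed example, not from the post) of the first three routes plus RFE on a simulated problem; stability selection is omitted because scikit-learn no longer ships an implementation. Dataset sizes and hyperparameters are arbitrary choices for illustration.

    # Illustrative sketch: four feature-selection approaches on toy regression data.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.feature_selection import SelectKBest, f_regression, RFE
    from sklearn.linear_model import LassoCV, LinearRegression

    X, y = make_regression(n_samples=300, n_features=10, n_informative=4,
                           noise=1.0, random_state=0)

    # 1. Univariate: score each feature against y on its own (F-test here).
    univariate = SelectKBest(score_func=f_regression, k=4).fit(X, y)

    # 2. Regularised linear model: the L1 penalty drives weak coefficients to zero.
    lasso = LassoCV(cv=5).fit(X, y)

    # 3. Random forest: impurity-based feature importances.
    forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # 4. RFE: recursively drop the weakest feature according to a base model.
    rfe = RFE(LinearRegression(), n_features_to_select=4).fit(X, y)

    print("Univariate keep:", univariate.get_support().nonzero()[0])
    print("Lasso non-zero :", (lasso.coef_ != 0).nonzero()[0])
    print("Forest top-4   :", forest.feature_importances_.argsort()[-4:])
    print("RFE keep       :", rfe.get_support().nonzero()[0])
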






Posted by Viswajit "Vish" Iyer at 4:07 AM
Labels: #aiml, #featureselection, #modelbuilding, #modelcomparison

