Syllabus

JNTUK B.Tech Machine Learning (Elective – II) for R13 Batch.

JNTUK B.Tech Machine Learning (Elective – II) gives you detailed information about the Machine Learning (Elective – II) R13 syllabus. It will be helpful in understanding your complete curriculum for the year.

Course objectives

  • The main objective of this course is for students to gain basic knowledge of artificial intelligence, a deeper technical understanding of machine learning research and theory, and practical experience in the use and design of machine learning and data mining algorithms for applications and experiments. The course has a strong focus on applied IT. Students learn not only how to critically review and compare different algorithms and methods, but also how to plan, design, and implement learning components and applications, and how to conduct machine learning experiments.

Course outcomes

  • The student will be able to evaluate and compare the performance, or other qualities, of algorithms for typical learning problems.
  • The student will be able to design a supervised or unsupervised learning system.

Syllabus

UNIT I: Introduction: Well-posed learning problems, Designing a learning system, Perspectives and issues in machine learning. Concept learning and the general-to-specific ordering – Introduction, A concept learning task, Concept learning as search, Find-S: finding a maximally specific hypothesis, Version spaces and the candidate elimination algorithm, Remarks on version spaces and candidate elimination, Inductive bias.
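To give a feel for the Find-S algorithm named in this unit, here is a minimal Python sketch. The attribute names and the "EnjoySport"-style training examples are made-up placeholders for illustration, not part of the official syllabus.

```python
def find_s(examples):
    """Find-S: return the maximally specific hypothesis consistent
    with the positive training examples.
    Each example is (attribute_tuple, label); '?' means 'any value'."""
    hypothesis = None
    for attributes, label in examples:
        if label != "yes":           # Find-S ignores negative examples
            continue
        if hypothesis is None:       # initialise with the first positive example
            hypothesis = list(attributes)
        else:                        # generalise minimally where values disagree
            hypothesis = [h if h == a else "?" for h, a in zip(hypothesis, attributes)]
    return hypothesis

# Hypothetical data: (Sky, AirTemp, Humidity, Wind) -> EnjoySport?
data = [
    (("Sunny", "Warm", "Normal", "Strong"), "yes"),
    (("Sunny", "Warm", "High",   "Strong"), "yes"),
    (("Rainy", "Cold", "High",   "Strong"), "no"),
]
print(find_s(data))   # -> ['Sunny', 'Warm', '?', 'Strong']
```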

UNIT II: Linear Regression & Logistic Regression: Predicting numeric values: regression – Finding the best-fit line with linear regression, Locally weighted linear regression, Shrinking coefficients, The bias/variance tradeoff. Logistic Regression: Classification with logistic regression and the sigmoid function, Using optimization to find the best regression coefficients.
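As a quick illustration of the two main ideas in this unit, here is a minimal sketch of ordinary least squares via the normal equation, plus the sigmoid function used in logistic regression. The sample data is invented purely for demonstration.

```python
import numpy as np

def ols_fit(X, y):
    """Best-fit coefficients via the normal equation: w = (X^T X)^-1 X^T y."""
    X = np.column_stack([np.ones(len(X)), X])      # prepend a bias (intercept) column
    return np.linalg.solve(X.T @ X, X.T @ y)

def sigmoid(z):
    """Logistic (sigmoid) function used for classification."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 1-D data lying near y = 2x + 1
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.1, 2.9, 5.2, 6.8])
print(ols_fit(X, y))    # roughly [1.1, 1.9]  (intercept, slope)
print(sigmoid(0.0))     # 0.5
```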

UNIT III: Artificial Neural Networks: Introduction, Neural network representation, Appropriate problems for neural network learning, Perceptrons, Multilayer networks and the backpropagation algorithm, Remarks on the backpropagation algorithm, An illustrative example: face recognition, Advanced topics in artificial neural networks.
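The simplest model in this unit is the perceptron. The sketch below shows the classic perceptron learning rule on a tiny, made-up linearly separable dataset (logical AND with +/-1 labels); it is only a toy example, not the backpropagation algorithm for multilayer networks.

```python
import numpy as np

def predict(w, X):
    """Threshold unit: +1 if w·x + bias >= 0, else -1."""
    return np.where(w[0] + X @ w[1:] >= 0, 1, -1)

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Perceptron learning rule: adjust weights only on misclassified examples."""
    w = np.zeros(X.shape[1] + 1)                 # bias term plus one weight per input
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = predict(w, xi.reshape(1, -1))[0]
            if pred != target:                   # update only on mistakes
                w[0]  += lr * target
                w[1:] += lr * target * xi
    return w

# Hypothetical linearly separable data: logical AND with +/-1 labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
print(predict(w, X))    # [-1 -1 -1  1], reproducing y
```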

UNIT IV: Evaluating Hypotheses: Motivation, Estimating hypothesis accuracy, Basics of sampling theory, A general approach for deriving confidence intervals, Difference in error of two hypotheses, Comparing learning algorithms.
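A central formula in this unit is the normal-approximation confidence interval for the true error of a hypothesis given its error on a test sample. Here is a small sketch; the figures (12 mistakes on 40 examples) are invented for illustration.

```python
import math

def error_confidence_interval(mistakes, n, z=1.96):
    """Approximate 95% confidence interval for the true error, given the
    sample error over n independent test examples (normal approximation)."""
    e = mistakes / n                             # observed sample error
    margin = z * math.sqrt(e * (1 - e) / n)
    return e - margin, e + margin

# Hypothetical result: 12 mistakes on 40 held-out examples
print(error_confidence_interval(12, 40))    # about (0.16, 0.44)
```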

UNIT V: Support Vector Machines & Dimensionality Reduction Techniques: Separating data with the maximum margin, Finding the maximum margin, Efficient optimization with the SMO algorithm, Speeding up optimization with the full Platt SMO, Using kernels for more complex data. Dimensionality reduction techniques: Principal Component Analysis, Example.
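A full SMO implementation is too long to show here, but the Principal Component Analysis part of this unit fits in a few lines. The sketch below centres the data, takes the eigenvectors of its covariance matrix, and projects onto the top-k components; the 2-D sample points are made up for demonstration.

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal components
    (eigenvectors of the covariance matrix of the centred data)."""
    X_centred = X - X.mean(axis=0)
    cov = np.cov(X_centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # symmetric matrix, ascending eigenvalues
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return X_centred @ top

# Hypothetical 2-D data that mostly varies along one direction
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0]])
print(pca(X, 1))    # one coordinate per sample along the main axis of variation
```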

UNIT VI: Instance-Based Learning: Introduction, k-Nearest Neighbor Learning, Locally Weighted Regression, Radial Basis Functions, Case-Based Reasoning, Remarks on Lazy and Eager Learning.
Genetic Algorithms: Representing Hypotheses, Genetic Operators, Fitness Function and Selection, Illustrative Example.
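For the instance-based learning part of this unit, here is a minimal k-Nearest Neighbour classifier using Euclidean distance and majority voting. The two-class training points and the query are hypothetical examples chosen only to show the mechanics.

```python
from collections import Counter
import numpy as np

def knn_classify(query, X, y, k=3):
    """k-Nearest Neighbour: label the query with the majority class
    among its k closest training points (Euclidean distance)."""
    distances = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(distances)[:k]
    return Counter(y[i] for i in nearest).most_common(1)[0][0]

# Hypothetical 2-D training data with two classes
X = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1], [5.0, 5.0], [5.2, 4.8]])
y = ["A", "A", "A", "B", "B"]
print(knn_classify(np.array([1.1, 1.0]), X, y))   # 'A'
```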

TEXT BOOKS

  • Machine Learning, Tom M. Mitchell, MGH
  • Machine Learning in Action, Peter Harrington, 2012, Cengage.

REFERENCE BOOKS

  • Introduction to Machine Learning, Ethem Alpaydin, PHI, 2004

For more information about all JNTU updates, please stay connected to us on FB, and don't hesitate to ask any questions in the comments.
