ECE 539 Final Project
An ANN approach to help manufacture a better car
Prabhdeep Singh Virk, Fall 2010
Car buying process
• Read reviews and consumer reports from various news agencies.
• Consider rankings provided by US News, JD Power, etc.
• Ask colleagues and friends for recommendations.
Good car – owner’s perspective
• Exterior and interior design?
• Performance: acceleration, speed, fuel economy, etc.?
• Safety features?
• Reliability?
• Overall price?
How to make a good car?
• Need to know which features make a car good.
• Predict what car consumers want and expect.
• Identify the features/design choices responsible for a high ranking.
• Identify changes/improvements that can affect the car's overall ranking.
When expectations don't match?
• The car company loses customers due to lack of interest in its product.
• Declining sales have catastrophic effects: lost jobs, lost revenue, and damage to the economy.
• In fact, failure to innovate and declining sales over the past decade were two major causes of the automotive industry crisis.
• In this project I implement the reverse mapping: accurately predicting a car's success from its features using an ANN.
• ANN algorithms have proven very successful in pattern-classification problems.
Pattern Classification using ANN
• Car Evaluation data from the UC Irvine data repository.
• 6 car features:
– Overall price
• Buying price
• Maintenance price
– Technical characteristics
• # of doors
• Capacity (persons)
• Luggage boot size
• Safety
• 4 output classes:
– Unacceptable (1210)
– Acceptable (384)
– Good (69)
– Very good (65)
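Before either classifier can use them, the six categorical features above must be encoded numerically. A minimal sketch, assuming a simple ordinal encoding (the category names follow the UCI Car Evaluation attribute values; the specific integer codes are my assumption, not taken from the project):

```python
# Sketch: ordinal encoding of the six Car Evaluation features so they can
# be fed to KNN or an MLP. Category names follow the UCI dataset; the
# integer code assignment below is an assumption for illustration.

# Ordered category levels for each of the 6 features.
FEATURE_LEVELS = {
    "buying":   ["low", "med", "high", "vhigh"],
    "maint":    ["low", "med", "high", "vhigh"],
    "doors":    ["2", "3", "4", "5more"],
    "persons":  ["2", "4", "more"],
    "lug_boot": ["small", "med", "big"],
    "safety":   ["low", "med", "high"],
}

FEATURE_ORDER = ["buying", "maint", "doors", "persons", "lug_boot", "safety"]

def encode(record):
    """Map one record (feature name -> category) to an integer vector."""
    return [FEATURE_LEVELS[f].index(record[f]) for f in FEATURE_ORDER]

example = {"buying": "vhigh", "maint": "low", "doors": "4",
           "persons": "more", "lug_boot": "big", "safety": "high"}
print(encode(example))  # -> [3, 0, 2, 2, 2, 2]
```

An ordinal encoding preserves the natural ordering of levels like low < med < high, which matters for distance-based methods such as KNN.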
Algorithms tested:
• K-Nearest Neighbors
• Multi-Layer Perceptron
K-Nearest Neighbor implementation.
• Tested with 1–15 neighbors.
• Increasing the number of neighbors has an adverse effect.
# of Neighbors | Confusion Matrix (rows = true class)          | Error Rate (%)
1              | [132 142 28 0; 45 40 13 0; 8 9 1 0; 6 8 0 0]  | 40.0463
15             | [302 0 0 0; 98 0 0 0; 18 0 0 0; 14 0 0 0]     | 69.9074
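The KNN behavior above can be sketched from scratch with Euclidean distance and majority voting. The toy data below is illustrative (not actual Car Evaluation samples), but it reproduces the slide's point: with an imbalanced class distribution, a large k lets the majority class drown out minority classes:

```python
# Minimal k-nearest-neighbor classifier sketch: Euclidean distance plus
# majority vote among the k closest training points. Toy data is assumed.
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k):
    """Classify x by majority vote among its k nearest training points."""
    dists = sorted(
        (math.dist(x, p), label) for p, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-feature training set with an imbalanced class distribution:
train_X = [(0, 0), (0, 1), (1, 0), (1, 1), (5, 5)]
train_y = ["unacc", "unacc", "unacc", "unacc", "good"]

print(knn_predict(train_X, train_y, (5, 4), k=1))  # -> "good"
# With k = 5 the four majority-class points outvote the one nearby "good":
print(knn_predict(train_X, train_y, (5, 4), k=5))  # -> "unacc"
```

This mirrors the k = 15 confusion matrix above, where every test sample ends up predicted as the majority class.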
Multi-Layer Perceptron implementation.
Data pre-processing:
• Scaling input features to the [-5, 5] range.
• Random train/test datasets, with a fixed minimum of 10 samples per class.
MLP configuration:
• Epochs = 1000
• Learning rate = 0.05
• Momentum = 0.8
• # of hidden layers = 2
• # of neurons per hidden layer = 6
• Steepest-descent gradient training.
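The steepest-descent-with-momentum update in the configuration above can be sketched in isolation. This is not the project's MLP; the gradient below comes from a toy quadratic, and only the update rule (learning rate 0.05, momentum 0.8) matches the slide:

```python
# Sketch of a gradient-descent weight update with momentum, using the
# hyperparameters from the slide. The objective f(w) = w^2 is a toy
# stand-in for the MLP's back-propagated error gradient.

LEARNING_RATE = 0.05
MOMENTUM = 0.8

def momentum_step(w, grad, velocity):
    """One update: velocity accumulates past gradients, smoothing descent."""
    velocity = MOMENTUM * velocity - LEARNING_RATE * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (gradient 2w) starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, 2.0 * w, v)  # gradient of w^2 is 2w
print(abs(w) < 1e-3)  # converged close to the minimum at w = 0
```

The velocity term is what makes momentum speed up convergence on smooth error surfaces, as the conclusion slide notes.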
Results after 10 iterations:
• Success rate range (%): 90.5093 to 95.6019
• Mean success rate (%): 92.4306
• Standard deviation (%): 1.4355
Resultant confusion matrix (rows = true class):
294   7   1   0
  5  87   5   1
  0   4  12   2
  0   3   1  10
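The success rate follows directly from the confusion matrix above: the diagonal entries are the correctly classified samples. A minimal sketch (assuming rows are true classes in the order unacceptable, acceptable, good, very good):

```python
# Sketch: computing the classification (success) rate from the resultant
# confusion matrix on the slide. Diagonal entries are correct predictions;
# the row order (unacc, acc, good, vgood) is assumed.

confusion = [
    [294, 7, 1, 0],
    [5, 87, 5, 1],
    [0, 4, 12, 2],
    [0, 3, 1, 10],
]

correct = sum(confusion[i][i] for i in range(len(confusion)))
total = sum(sum(row) for row in confusion)
success_rate = 100.0 * correct / total
print(f"{success_rate:.2f}%")  # -> 93.29%
```

The result (403/432, about 93.29%) falls inside the reported 90.51–95.60% success-rate range.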
(Plot: training error curves illustrating the role of learning rate and momentum.)
Conclusion
• K-nearest neighbors is ineffective due to the imbalanced class distribution.
• The MLP performed well, as long as it was trained with at least 10 samples of each class.
• Feature scaling improves the classification rate.
• The classification rate improves as the number of hidden neurons increases.
• Momentum helps training converge faster.
• A high learning rate (> 0.5) causes the error to oscillate.
• It is possible to predict a car's success ranking from the available features.
Questions?
Thank you