Admin (Forum Administrator)
Posts: 18996 | Rating: 35494 | Joined: 01/07/2009 | Country: Egypt | Role: Administrator of the Production Engineering and Mechanical Design Forum
Topic: The book "Analysis and Design of Machine Learning Techniques", Tuesday 22 June 2021, 10:25 pm
Dear brothers, I have brought you the book "Analysis and Design of Machine Learning Techniques: Evolutionary Solutions for Regression, Prediction, and Control Problems" by Patrick Stalph.
The contents are as follows:
Contents

1 Introduction and Motivation
  1.1 How to Learn Motor Skills?
  1.2 The Robotics Viewpoint
  1.3 From the View of Cognitive Science
  1.4 Requirements for Plausible and Feasible Models
  1.5 Scope and Structure

I Background

2 Introduction to Function Approximation and Regression
  2.1 Problem Statement
  2.2 Measuring Quality
  2.3 Function Fitting or Parametric Regression
    2.3.1 Linear Models with Ordinary Least Squares
    2.3.2 Online Approximation with Recursive Least Squares
  2.4 Non-Parametric Regression
    2.4.1 Interpolation and Extrapolation
    2.4.2 Gaussian Process Regression
    2.4.3 Artificial Neural Networks
  2.5 Local Learning Algorithms
    2.5.1 Locally Weighted Projection Regression
    2.5.2 XCSF – a Learning Classifier System
  2.6 Discussion: Applicability and Plausibility

3 Elementary Features of Local Learning Algorithms
  3.1 Clustering via Kernels
    3.1.1 Spherical and Ellipsoidal Kernels
    3.1.2 Alternative Shapes
  3.2 Local Models
  3.3 Inference as a Weighted Sum
  3.4 Interaction of Kernel, Local Models, and Weighting Strategies

4 Algorithmic Description of XCSF
  4.1 General Workflow
  4.2 Matching, Covering, and Weighted Prediction
  4.3 Local Model Adaptation
  4.4 Global Structure Evolution
    4.4.1 Uniform Crossover and Mutation
    4.4.2 Adding new Receptive Fields and Deletion
    4.4.3 Summary
  4.5 Relevant Extensions to XCSF
    4.5.1 Subsumption
    4.5.2 Condensation and Compaction

II Analysis and Enhancements of XCSF

5 How and Why XCSF works
  5.1 XCSF's Objectives
  5.2 Accuracy versus Generality
  5.3 Coverage and Overlap
  5.4 Three Phases to Meet the Objectives

6 Evolutionary Challenges for XCSF
  6.1 Resource Management and Scalability
    6.1.1 A Simple Scenario
    6.1.2 Scalability Theory
    6.1.3 Discussion
    6.1.4 Empirical Validation
    6.1.5 Structure Alignment Reduces Problem Complexity
    6.1.6 Summary and Conclusion
  6.2 Guided Mutation to Reduce Learning Time
    6.2.1 Guiding the Mutation
    6.2.2 Experimental Validation
    6.2.3 Experiment 2: A 10D Sine Wave
    6.2.4 What is the Optimal Guidance Probability?
    6.2.5 Summary and Conclusion

III Control Applications in Robotics

7 Basics of Kinematic Robot Control
  7.1 Task Space and Forward Kinematics
  7.2 Redundancy and Singularities
    7.2.1 Singularities
  7.3 Smooth Inverse Kinematics and the Nullspace
    7.3.1 Singular Value Decomposition
    7.3.2 Pseudoinverse and Damped Least Squares
    7.3.3 Redundancy and the Jacobian's Nullspace
  7.4 A Simple Directional Control Loop

8 Learning Directional Control of an Anthropomorphic Arm
  8.1 Learning Velocity Kinematics
    8.1.1 Learning on Trajectories
    8.1.2 Joint Limits
  8.2 Complete Learning and Control Framework
  8.3 Simulation and Tasks
    8.3.1 Target Generation
  8.4 Evaluating Model Performance
  8.5 Experiments
    8.5.1 Linear Regression for Control
    8.5.2 RBFN
    8.5.3 XCSF
    8.5.4 LWPR
    8.5.5 Exploiting Redundancy: Secondary Constraints
    8.5.6 Representational Independence
  8.6 Summary and Conclusion

9 Visual Servoing for the iCub
  9.1 Vision Defines the Task Space
    9.1.1 Reprojection with Stereo Cameras
  9.2 Learning to Control Arm and Head
  9.3 Experimental Validation

10 Summary and Conclusion
  10.1 Function Approximation in the Brain?
  10.2 Computational Demand of Neural Network Approximation
  10.3 Learning Motor Skills for Control
    10.3.1 Retrospective: Is it Cognitive and Plausible?
    10.3.2 On Optimization and Inverse Control
  10.4 Outlook

Appendix A: A one-dimensional Toy Problem for Regression

Bibliography

List of Figures

1.1 Planar arm with two links
1.2 Kinematic redundancy with three links
1.3 The iCub robot
2.1 Toy problem illustrates the task of function approximation
2.2 Ordinary least squares model
2.3 Iterative updates of a recursive least squares model
2.4 Interpolation
2.5 Gaussian kernel
2.6 Gaussian process regression example
2.7 Radial basis function network
2.8 RBFN trained on toy data
3.1 Possible kernels for RBFNs
3.2 Axis aligned and general ellipsoidal kernels
3.3 Geometric illustration of an ellipsoidal transformation
3.4 Axis aligned rectangular kernel
3.5 Interaction of clustering and local models
3.6 Exponential weighting of local models
4.1 XCSF workflow
4.2 Illustration of subsumption
4.3 Subsumption with general ellipsoids
5.1 Balance of pressures
5.2 Illustration of deletion probability
5.3 Typical XCSF performance graph
6.1 Interplay between function, model, and clustering
6.2 Assumptions for the scalability model
6.3 Approximation error for the simple scalability scenario
6.4 Optimal receptive field volume and population size
6.5 Empirical validation of the scalability theory
6.6 A linear function poses a checkerboard problem for XCSF
6.7 Appropriate representations reduce problem complexity
6.8 Schematic of a guided mutation
6.9 Benchmark functions for guided XCSF
6.10 Guided XCSF on the 6D sine
6.11 Guided XCSF on the 10D sine
6.12 Guided XCSF on the 10D crossed ridge function
6.13 The optimal guidance probability depends on the problem
7.1 Schematic of a planar, two joint arm
7.2 Illustration of velocity forward kinematics
7.3 Illustration of redundant joint configurations
7.4 Illustration of a singular configuration
7.5 The singular value decomposition
7.6 Different strategies to handle singularities
7.7 A velocity profile
7.8 Deadlocks
8.1 Context and prediction input can be separated
8.2 The complete learning and control framework
8.3 Schematic of the simulated, anthropomorphic arm
8.4 Task generation
8.5 Static linear model for inverse kinematic control
8.6 Dynamic linear model for control
8.7 RBFN control performance
8.8 XCSF control performance
8.9 LWPR control performance
8.10 The effect of secondary constraints
8.11 Performance for the egocentric task space representation
8.12 Pareto ranking for kinematic control performance
9.1 The iCub robot
9.2 Different coordinate systems for iCub vision
9.3 Stereo camera projection
9.4 The modified control loop for the iCub robot
9.5 Control performance for the iCub robot with vision
9.6 Task space trajectories on the asterisk task

List of Algorithms

4.1 Matching and covering
4.2 Weighted prediction
4.3 Local updates of matching receptive fields
4.4 Selection, reproduction, crossover, and mutation
4.5 Deletion via roulette wheel selection
6.1 Guided mutation

Glossary

ANN: Artificial Neural Network.
control space: The space of control commands, e.g. torques or joint velocities.
DoF: Degrees of Freedom.
FA: Function Approximation.
forward kinematics: A mapping from joint configuration to task space location.
GA: Genetic Algorithm.
GPR: Gaussian Process Regression.
iCub: A humanoid robot.
inverse kinematics: Inversion of the forward kinematics.
Jacobian: A joint-configuration-dependent matrix that specifies the effect of joint angle changes on the task space state.
kernel: A positive semi-definite matrix that defines a distance metric.
kinematic chain: A number of rigid bodies connected by rotational or translational joints.
LCS: Learning Classifier System.
LWPR: Locally Weighted Projection Regression.
MAE: Mean Absolute Error.
ML: Machine Learning.
MSE: Mean Squared Error.
nullspace: The space of vectors mapped to zero by a matrix.
OLS: Ordinary Least Squares.
PLS: Partial Least Squares.
pseudoinverse: Also known as the Moore-Penrose matrix; denotes the closest solution to a matrix inversion when an exact solution is not possible.
range: The space spanned by all possible linear combinations of the matrix columns.
RBFN: Radial Basis Function Network.
reprojection: Reconstruction of a 3D object location from two camera images.
RF: Receptive Field.
RLS: Recursive Least Squares.
RMSE: Root Mean Squared Error.
subsumption: Accurate receptive fields may subsume other, more specific receptive fields to foster generalization.
SVD: Singular Value Decomposition.
task space: The space where tasks are defined, e.g. the Cartesian location of the end effector.
XCS: The accuracy-based Learning Classifier System introduced by S. Wilson, sometimes called the eXtended Learning Classifier System.
XCSF: The function approximation mode of XCS.
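As a small illustration of the glossary terms Jacobian, pseudoinverse, and nullspace that the book's control chapters build on, here is a hedged, pure-Python sketch (not code from the book) for a planar three-link arm with a 2D task space, i.e. one redundant DoF. The link lengths, joint configuration, and helper names are assumptions made for this example only:

```python
# Illustrative sketch only, not the book's implementation.
# Planar 3-link arm: 2D task space (x, y), 3 joints -> one redundant DoF.
import math

LINKS = (1.0, 1.0, 1.0)  # assumed link lengths

def jacobian(q):
    """2x3 Jacobian of the planar forward kinematics at configuration q."""
    c = [sum(q[:i + 1]) for i in range(3)]  # cumulative joint angles
    row_x = [-sum(LINKS[j] * math.sin(c[j]) for j in range(k, 3)) for k in range(3)]
    row_y = [ sum(LINKS[j] * math.cos(c[j]) for j in range(k, 3)) for k in range(3)]
    return [row_x, row_y]

def pseudoinverse(J):
    """Moore-Penrose pseudoinverse J+ = J^T (J J^T)^-1 for a full-row-rank 2x3 J."""
    A = [[sum(J[r][k] * J[s][k] for k in range(3)) for s in range(2)] for r in range(2)]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    Ainv = [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]
    return [[sum(J[r][c] * Ainv[r][s] for r in range(2)) for s in range(2)]
            for c in range(3)]  # 3x2 result

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

q = (0.3, 0.2, 0.1)    # assumed joint configuration (radians)
x_dot = (0.10, -0.05)  # desired task space velocity
J = jacobian(q)
q_dot = mat_vec(pseudoinverse(J), x_dot)  # least-norm joint velocity

# Sanity check: J q_dot reproduces the commanded task space velocity.
print([round(v, 6) for v in mat_vec(J, q_dot)])  # -> [0.1, -0.05]

# Nullspace projector (I - J+ J): joint motions in its range leave the
# end effector still, which is how secondary constraints are realized.
Jp = pseudoinverse(J)
N = [[(1.0 if r == c else 0.0) - sum(Jp[r][k] * J[k][c] for k in range(2))
      for c in range(3)] for r in range(3)]
self_motion = mat_vec(N, [1.0, 0.0, 0.0])  # J @ self_motion is (numerically) zero
```

This is the textbook pseudoinverse route to velocity-level inverse kinematics; near singularities the book's chapter 7.3.2 instead recommends damped least squares, which this sketch omits for brevity.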
The unzip password: books-world.net
I hope you benefit from the topic's content and that you like it.
Link from the Books World site to download the book Analysis and Design of Machine Learning Techniques
Direct link to download the book Analysis and Design of Machine Learning Techniques