Table of Contents

1 Introduction  15
1.1 What Does This Book Offer?  15
1.2 What Is Artificial Intelligence?  17
1.3 The History of AI: A Brief Overview  18
1.4 Development Tools Used in This Book  20
1.4.1 Python  20
1.4.2 Jupyter Notebook  22
1.4.3 KNIME  22
1.4.4 ChatGPT and GPT-4  23
1.4.5 DALL-E 2 or DALL-E 3  24
2 Installation  25
2.1 Anaconda Distribution  25
2.1.1 Windows and macOS  26
2.1.2 Linux  26
2.1.3 Configuration and Test  27
2.2 KNIME  30
2.2.1 Installation  31
2.2.2 Configuration  34
2.2.3 Test  37
3 Artificial Neural Networks  39
3.1 Classification  40
3.2 The Recipe  41
3.2.1 Data Preparation  42
3.2.2 Building Up the AI  43
3.2.3 Training the AI  43
3.2.4 Testing the AI  44
3.2.5 Using the AI  45
3.3 Building ANNs  45
3.4 Structure of an Artificial Neuron  47
3.5 Feed Forward  48
3.6 Back Propagation  51
3.7 Updating the Weights  53
3.8 ANN for Classification  55
3.9 Hyperparameters and Overfitting  63
3.10 Dealing with Nonnumerical Data  65
3.11 Dealing with Data Gaps  67
3.11.1 Filling Empty Cells with Data  68
3.11.2 Removing Rows with Empty Cells  68
3.12 Correlation versus Causality  69
3.13 Standardization of the Data  76
3.14 Regression  78
3.15 Deployment  81
3.15.1 Training, Testing, and Saving  81
3.15.2 Using the ANN Model  83
3.16 Exercises  85
3.16.1 Exercise 1: Hyperparameter Optimization for Classification  85
3.16.2 Exercise 2: Hyperparameter Optimization for Regression  86
3.16.3 Exercise 3: ANN for Classification  86
3.16.4 Exercise 4: ANN for Regression  87
4 Decision Trees  89
4.1 Simple Decision Trees  90
4.1.1 Decision Tree Classifier  90
4.1.2 Decision Tree Regressor  96
4.1.3 Decision Forests  99
4.1.4 Random Forest Classifier  99
4.1.5 Random Forest Regressor  100
4.2 Boosting  100
4.2.1 Gradient Boosting  100
4.2.2 XGBoost Classifier  103
4.2.3 Automatic Hyperparameter Setting Using GridSearchCV  107
4.3 XGBoost Regressor  109
4.4 Deployment  110
4.5 Decision Trees Using Orange  111
4.6 Exercises  115
4.6.1 Exercise 1: XGBoost for Classification  115
4.6.2 Exercise 2: XGBoost for Regression  116
4.6.3 Exercise 3: Automatic Hyperparameter Optimization  116
5 Convolutional Layers and Images  117
5.1 Simple Image Classification  118
5.2 Hyperparameter Optimization Using Early Stopping and KerasTuner  123
5.3 Convolutional Neural Network  128
5.4 Image Classification Using CIFAR-10  134
5.5 Using Pretrained Networks  137
5.6 Exercises  140
5.6.1 Exercise 1: Hyperparameter Optimization for CIFAR-10  140
5.6.2 Exercise 2: Pretrained VGG19 Model  140
6 Transfer Learning  141
6.1 How It Works  143
6.2 Exercises  150
6.2.1 Exercise 1: Rock-Paper-Scissors  150
6.2.2 Exercise 2: Human or Horse  150
7 Anomaly Detection  151
7.1 Unbalanced Data  152
7.2 Resampling  156
7.3 Autoencoders  158
7.4 Exercises  164
7.4.1 Exercise 1: Anomaly Detection Using XGBoost and Upsampling  164
7.4.2 Exercise 2: Anomaly Detection Using an Autoencoder  164
8 Text Classification  165
8.1 Embedding Layer  165
8.2 GlobalAveragePooling1D Layer  168
8.3 Text Vectorization  170
8.4 Analysis of the Relationships  173
8.5 Classifying Large Amounts of Data  177
8.6 Exercises  180
8.6.1 Exercise 1: Hyperparameter Optimization  180
8.6.2 Exercise 2: Text Classification  180
8.6.3 Exercise 3: Text Classification Using Upsampling  180
9 Cluster Analysis  181
9.1 Graphical Analysis of the Data  182
9.2 The k-Means Clustering Algorithm  186
9.3 The Finished Program  189
9.4 Exercises  192
9.4.1 Exercise 1: Grouping of Diamonds  192
9.4.2 Exercise 2: Grouping of Mushrooms  192
10 AutoKeras  193
10.1 Classification  194
10.2 Regression  195
10.3 Image Classification  196
10.4 Text Classification  199
10.5 Exercises  202
10.5.1 Exercise 1: Classification  202
10.5.2 Exercise 2: Regression  202
10.5.3 Exercise 3: Image Classification  202
10.5.4 Exercise 4: Text Classification  202
11 Visual Programming Using KNIME  203
11.1 Simple ANNs  204
11.1.1 Classification  204
11.1.2 Classification Using Python Node  216
11.1.3 Regression  218
11.1.4 Regression Using Python Node  221
11.2 XGBoost  223
11.2.1 Classification  223
11.2.2 Deployment  225
11.2.3 Regression  226
11.3 Image Classification Using a Pretrained Model  227
11.3.1 Image Classification Using Keras Node  227
11.3.2 Image Classification Using Python Node  231
11.4 Transfer Learning  232
11.4.1 Transfer Learning Using Keras Node  232
11.4.2 Transfer Learning Using Python Node  235
11.5 Autoencoder  237
11.5.1 Autoencoder with Keras Node  238
11.5.2 Autoencoder with Python Node  242
11.6 Text Classification  245
11.6.1 Text Classification with Keras Node  245
11.6.2 Text Classification with Python Node  247
11.7 AutoML  249
11.7.1 Installation  249
11.7.2 Classification  250
11.8 Cluster Analysis  253
11.8.1 Manual Cluster Setting  253
11.8.2 Cluster Setting with a Loop  254
11.9 Time Series Analysis  257
11.9.1 Recurrent Neural Networks  257
11.9.2 Long Short-Term Memory  259
11.9.3 Prediction of Energy Consumption (Next Hour) Using Keras Node  260
11.9.4 Prediction of Energy Consumption (Next Hour) Using Python Node  265
11.9.5 Prediction of Energy Consumption (Next 500 Hours) Using Keras Node  267
11.9.6 Prediction of Energy Consumption (Next 500 Hours) Using Python Node  269
11.10 Text Generation  271
11.10.1 Data Preparation  272
11.10.2 Training  274
11.10.3 Generation  274
11.11 Further Information on KNIME  277
11.12 Exercises  278
11.12.1 Exercise 1: XGBoost for Classification, Mushrooms  278
11.12.2 Exercise 2: XGBoost for Regression, Diamonds  278
11.12.3 Exercise 3: Image Classification Using InceptionV3  278
11.12.4 Exercise 4: Transfer Learning, Horses or Humans  278
11.12.5 Exercise 5: Anomaly Detection Using an Autoencoder, ECG  278
11.12.6 Exercise 6: Text Classification  278
11.12.7 Exercise 7: AutoML for Regression  279
11.12.8 Exercise 8: Cluster Analysis  279
11.12.9 Exercise 9: Time Series Analysis  279
11.12.10 Exercise 10: Text Generation  279
12 Reinforcement Learning  281
12.1 Q-Learning  282
12.2 Python Knowledge Required for the Game  287
12.2.1 Lists  287
12.2.2 Branches  288
12.2.3 Loops  289
12.2.4 Random Choice  290
12.2.5 Functions  291
12.3 Training  292
12.4 Test  294
12.5 Outlook  295
12.6 Exercises  296
12.6.1 Exercise 1: Hyperparameters  296
12.6.2 Exercise 2: Expansion of the Game  296
13 Genetic Algorithms  297
13.1 The Algorithm  298
13.1.1 Start Generation  299
13.1.2 Selection  299
13.1.3 Reproduction  300
13.1.4 Mutation  300
13.1.5 New Generation  301
13.2 Example of a Sorted List  301
13.3 Example of Equation Systems  304
13.4 Real-Life Sample Application  306
13.5 Exercises  309
13.5.1 Exercise 1: Hyperparameter Optimization  309
13.5.2 Exercise 2: System of Equations  309
14 ChatGPT and GPT-4  311
14.1 Prompt Engineering  313
14.1.1 Generating Content  314
14.1.2 Programming  318
14.1.3 Analyzing and Summarizing  324
14.1.4 Final Questions for ChatGPT  326
14.2 The ChatGPT Programming Interface  328
14.2.1 Application Programming Interface Key and First Program  329
14.2.2 Parameters  331
14.2.3 Input Filters  334
14.2.4 Roles  337
14.2.5 Memory  339
14.2.6 User Profiles  340
14.2.7 Playground  341
14.2.8 Speech to Text  341
14.3 Exercise 1: Math Support  344
15 DALL-E and Successor Models  345
15.1 DALL-E 2  345
15.1.1 Prompt Engineering  346
15.1.2 Editing Generated Images  347
15.2 DALL-E 3  350
15.3 Programming Interface  352
15.3.1 Image Creation  352
15.3.2 Image Variations  354
15.3.3 Image Processing  355
15.4 Exercise 1: DALL-E API with Moderation  357
16 Outlook  359
Appendices  361
A Exercise Solutions  363
A.1 Chapter 3  363
Exercise 1: Hyperparameter Optimization for Classification  363
Exercise 2: Hyperparameter Optimization for Regression  364
Exercise 3: ANN for Classification  365
Exercise 4: ANN for Regression  367
A.2 Chapter 4  368
Exercise 1: XGBoost for Classification  368
Exercise 2: XGBoost for Regression  369
Exercise 3: Automatic Hyperparameter Optimization  370
A.3 Chapter 6  371
Exercise 1: Rock-Paper-Scissors  371
Exercise 2, Part 1: Human or Horse, Training and Testing  371
Exercise 2, Part 2: Human or Horse, Application  372
A.4 Chapter 7  373
Exercise 1: Anomaly Detection Using XGBoost and Upsampling  373
Exercise 2: Anomaly Detection Using an Autoencoder  374
A.5 Chapter 8  376
Exercise 1: Hyperparameter Optimization  376
Exercise 2: Text Classification  376
Exercise 3: Text Classification Using Upsampling  377
A.6 Chapter 9  379
Exercise 1: Grouping of Diamonds  379
Exercise 2: Grouping of Mushrooms  380
A.7 Chapter 10  381
Exercise 1: Classification  381
Exercise 2: Regression  381
Exercise 3: Image Classification  382
Exercise 4: Text Classification  383
A.8 Chapter 11  384
Exercise 1: XGBoost for Classification, Mushrooms  384
Exercise 2: XGBoost for Regression, Diamonds  384
Exercise 3: Image Classification Using InceptionV3  385
Exercise 4: Transfer Learning, Human or Horse  385
Exercise 5: Anomaly Detection Using an Autoencoder, ECG  386
Exercise 6: Text Classification  386
Exercise 7: AutoML for Regression  387
Exercise 8: Cluster Analysis  387
Exercise 9: Time Series Analysis  388
Exercise 10: Text Generation  388
A.9 Chapter 12  389
Exercise 1: Hyperparameters  389
Exercise 2: Expansion of the Game  389
A.10 Chapter 13  390
Exercise 1: Hyperparameter Optimization  390
Exercise 2: System of Equations  390
A.11 Chapter 14  392
Exercise 1: Math Support  392
A.12 Chapter 15  393
Exercise 1: DALL-E API with Moderation  393
B References  395
C The Author  397
Index  399