Brute-forces all scikit-learn models with all their parameters, exposed through the familiar **fit**/**predict** interface.

```
from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor
```

Yes.

Many believe that most of the work in supervised (non-deep) machine learning lies in feature engineering, whereas model selection amounts to running through all the models, or just defaulting to XGBoost. So here is an automation for that.

Runs through all `sklearn` models (both classification and regression) with **all possible hyperparameters**, and ranks them using cross-validation.

Runs **all the models** available in `sklearn` for supervised learning. The categories are:

- Generalized Linear Models
- Kernel Ridge
- Support Vector Machines
- Nearest Neighbors
- Gaussian Processes
- Naive Bayes
- Trees
- Neural Networks
- Ensemble methods

Note: a few models were dropped (nearly none of them), and some crash or raise exceptions from time to time. It takes REALLY long to test this out, so clearing the exceptions took a while.
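The underlying idea can be sketched in plain `sklearn`: loop over a set of candidate estimators, score each with cross-validation, skip any that crash, and keep the best. This is only an illustrative sketch of the approach; the candidate list and scoring below are my own choices, not the library's actual implementation.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A small stand-in for "all sklearn models": score each candidate
# with 3-fold cross-validation and keep the best mean accuracy.
candidates = [
    LogisticRegression(max_iter=1000),
    KNeighborsClassifier(),
    DecisionTreeClassifier(),
]

scores = {}
for clf in candidates:
    try:
        scores[type(clf).__name__] = cross_val_score(clf, X, y, cv=3).mean()
    except Exception:
        # Some estimators crash on some data; skip them, as the note above says.
        continue

best_name = max(scores, key=scores.get)
print(best_name, round(scores[best_name], 3))
```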

Install with pip:

```
pip install hunga-bunga
```

Dependencies

```
- Python (>= 2.7)
- NumPy (>= 1.11.0)
- SciPy (>= 0.17.0)
- joblib (>= 0.11)
- scikit-learn (>=0.20.0)
- tabulate (>=0.8.2)
- tqdm (>=4.28.1)
```

Import from here:

```
from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor
```

And use it as any other sklearn model:

```
clf = HungaBungaClassifier()
clf.fit(x, y)
clf.predict(x)
```

Setting `brain=True` enables verbose output during the search:

```
clf = HungaBungaClassifier(brain=True)
clf.fit(x, y)
```
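Trying **all possible hyperparameters** for a single model corresponds to sklearn's own exhaustive grid search. A minimal sketch with `GridSearchCV` (the parameter grid below is my own example, not the grid the library uses):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Exhaustively try every combination in the grid, scoring each
# with 3-fold cross-validation, and keep the best one.
param_grid = {
    "n_neighbors": [1, 3, 5, 7],
    "weights": ["uniform", "distance"],
}
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=3)
search.fit(X, y)

print(search.best_params_, round(search.best_score_, 3))
```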

The output looks like this:

Model | Accuracy | Time/clf (s) |
---|---|---|
SGDClassifier | 0.967 | 0.001 |
LogisticRegression | 0.940 | 0.001 |
Perceptron | 0.900 | 0.001 |
PassiveAggressiveClassifier | 0.967 | 0.001 |
MLPClassifier | 0.827 | 0.018 |
KMeans | 0.580 | 0.010 |
KNeighborsClassifier | 0.960 | 0.000 |
NearestCentroid | 0.933 | 0.000 |
RadiusNeighborsClassifier | 0.927 | 0.000 |
SVC | 0.960 | 0.000 |
NuSVC | 0.980 | 0.001 |
LinearSVC | 0.940 | 0.005 |
RandomForestClassifier | 0.980 | 0.015 |
DecisionTreeClassifier | 0.960 | 0.000 |
ExtraTreesClassifier | 0.993 | 0.002 |

*The winner is: ExtraTreesClassifier with score 0.993.*
