Negotiated Representations to Prevent Overfitting in Machine Learning Applications
Keywords: Machine learning, Negotiated representations, Overfitting

Abstract
Overfitting is a phenomenon that occurs when a machine learning model is trained for too long: it focuses so closely on fitting the training samples to the provided training labels that it loses track of the predictive rules that would be useful on the test data. This phenomenon is commonly attributed to memorization of samples, memorization of noise, and forced fitting to a data set with a limited number of samples by using a high number of neurons. While it is true that the model encodes various peculiarities as the training process continues, it is argued here that most of the overfitting occurs in the process of reconciling sharply defined membership ratios. This study presents an approach that increases the classification accuracy of machine learning models by allowing the model to negotiate the output representations of the samples with the previously determined class labels. By setting up a negotiation between the model's interpretation of the inputs and the provided labels, the model not only increased average classification accuracy but also decreased the rate of overfitting without any other regularization method being applied. By applying the negotiation paradigm to several low-data-regime machine learning problems, constructed as overfitting scenarios from publicly available data sets such as CIFAR 10, CIFAR 100, and MNIST, it is demonstrated that the proposed paradigm has more capacity than its intended purpose. The experimental results are shared, and the machine learning community is invited to explore the limits of the proposed paradigm. This work also aims to incentivize the machine learning community to exploit the negotiation paradigm to overcome learning-related challenges in other research fields, such as continual learning. The Python code of the experimental setup has been uploaded to GitHub (https://github.com/nurikorhan/Negotiated-Representations).
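As a rough illustration of the idea described above, the sketch below shows one plausible way a "negotiation" between the provided labels and the model's own interpretation could enter a training step: the fixed one-hot targets are blended with the model's current softmax output before the loss is computed. This is a hypothetical reading for orientation only; the blending rule, the `alpha` parameter, and all function names are assumptions and not taken from the authors' released code.

```python
# Hypothetical sketch of label "negotiation": blend the given one-hot label
# with the model's own softmax belief before computing the loss.
import torch
import torch.nn.functional as F

def negotiated_targets(one_hot_labels, model_probs, alpha=0.9):
    """Assumed blending rule between fixed labels and model predictions."""
    return alpha * one_hot_labels + (1.0 - alpha) * model_probs

def training_step(model, x, y, num_classes, optimizer, alpha=0.9):
    logits = model(x)
    probs = F.softmax(logits, dim=1).detach()            # model's interpretation of the inputs
    one_hot = F.one_hot(y, num_classes).float()          # previously determined class labels
    targets = negotiated_targets(one_hot, probs, alpha)  # negotiated representation (assumed form)
    # Cross-entropy against the softened, negotiated targets.
    loss = torch.sum(-targets * F.log_softmax(logits, dim=1), dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this reading, the softened targets would relax the sharply defined membership ratios that the abstract identifies as a main source of overfitting; the exact mechanism used in the paper should be taken from the linked repository.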

