Abstract
In a context of ever-increasing traffic, degradation of the optical layer can affect client demands, in particular the quality of service provided by telecommunications operators. Rapid detection and prediction of performance degradations occurring on the optical lightpath can therefore help minimize errors in the network. This paper proposes a failure detection model (equivalently, a performance degradation detection model) based on machine learning (ML) techniques, namely the interquartile range (IQR) and support vector machine (SVM) methods. The model is built from performance metrics monitored on real optical lightpaths, and it both labels the anomalies to be defined on the data and captures the features to be used. Feature engineering is explored using three ML techniques, namely the Boruta algorithm, the random forest classifier, and recursive feature elimination (RFE), to select the features most useful for implementing the model. In a validation phase on monitored performance metrics, the model using the RFE method gives the best results, with an F1-score of 99.51% and a recall of 100%. These results demonstrate the model's ability to detect degradation of network performance in advance.
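The pipeline described in the abstract, IQR-based labeling of anomalies followed by RFE feature selection around an SVM classifier, can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' implementation: the number of metrics, the injected degradation, and the choice of a linear kernel are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score, recall_score

rng = np.random.default_rng(0)

# Synthetic stand-in for monitored performance metrics:
# 500 time samples x 8 metrics (the paper's real metrics differ).
X = rng.normal(size=(500, 8))

# Inject a degradation into the first metric for 50 samples.
degraded = rng.choice(500, size=50, replace=False)
X[degraded, 0] += 6.0

# IQR rule: a sample is labeled anomalous when its key metric falls
# outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, q3 = np.percentile(X[:, 0], [25, 75])
iqr = q3 - q1
y = ((X[:, 0] < q1 - 1.5 * iqr) | (X[:, 0] > q3 + 1.5 * iqr)).astype(int)

# RFE wraps a linear SVM, recursively dropping the weakest features
# until only the most informative ones remain.
selector = RFE(SVC(kernel="linear"), n_features_to_select=3)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
selector.fit(X_train, y_train)
y_pred = selector.predict(X_test)

f1 = f1_score(y_test, y_pred)
rec = recall_score(y_test, y_pred)
print(f"F1={f1:.4f}  recall={rec:.4f}")
```

Because the labels here are derived directly from one metric, the linear SVM separates them almost perfectly; on real monitored lightpaths the features are noisier and the reported scores (99.51% F1, 100% recall) come from the authors' validation, not from this sketch.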