Aim: The objective of this work is to learn long- and short-term temporal dependencies in the data to enhance the early detection of sepsis in ICU patients.

Methods: The data were first pre-processed. Missing vital-sign values were imputed using second-order interpolation, missing laboratory values were replaced by the mean of each variable, and categorical features were one-hot encoded. Time-series analysis of the patient data was performed with LSTM networks. To that end, a sliding window was used to capture the temporal evolution of each patient leading up to the eventual diagnosis of sepsis; that is, the last 'n' hours of data were added to each patient sample. The model parameters were estimated by minimizing a cost function through stochastic gradient descent. Because the data were highly imbalanced, instead of the binary cross-entropy conventionally used in classification problems, the cost function was an approximation to the Wilcoxon-Mann-Whitney statistic, which is equivalent to the AUC but differentiable. Once trained, the model output the probability that a patient had sepsis, and the decision threshold was derived by maximizing Youden's index over the ROC curve.

Results: The initial dataset of 5000 patients was split into training, validation, and test sets in a 70-15-15% proportion, and n = 8 hours was used. The training set was randomly under-sampled to a 1:1 class ratio to prevent overfitting on the majority class. A 3-layer LSTM with 128 nodes per layer was trained, yielding the following scores: AUROC: 0.78 | AUPRC: 0.083 | Accuracy: 0.83 | F-measure: 0.09 | Utility: 0.33.

Conclusion: Although no hyperparameter optimization was performed, the LSTM proved useful for the early detection of sepsis. As future work, we propose applying other pre-processing techniques (e.g., incorporating medical knowledge into the imputation process) and more advanced network architectures that could further improve detection.
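The sliding-window construction described in Methods (keeping the last n hours of each patient record) can be sketched as below. This is an illustrative NumPy sketch, not the authors' code: the function name `make_windows` and the zero front-padding of stays shorter than n hours are assumptions the abstract does not specify.

```python
import numpy as np

def make_windows(records, n=8):
    """Build fixed-length samples from the last n hours of each patient.

    records: list of (T, F) arrays of hourly features (T hours, F features).
    Stays shorter than n hours are front-padded with zeros (an assumption).
    Returns an array of shape (num_patients, n, F) suitable for an LSTM.
    """
    F = records[0].shape[1]
    out = np.zeros((len(records), n, F))
    for i, rec in enumerate(records):
        tail = rec[-n:]                   # last n hours of this patient
        out[i, n - len(tail):] = tail     # front-pad short stays with zeros
    return out
```

In a sliding-window setting, one such sample would be produced per prediction hour, so each hourly prediction sees only the preceding n hours of data.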
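The differentiable Wilcoxon-Mann-Whitney surrogate used as the cost function can be sketched as a pairwise ranking loss: it penalizes every positive/negative score pair whose margin falls below a threshold, so minimizing it maximizes an approximation of the AUC. The hinge form and the values gamma = 0.3 and p = 2 follow the common formulation of this surrogate (Yan et al.) and are assumptions, not values stated in the abstract.

```python
import numpy as np

def wmw_loss(scores, labels, gamma=0.3, p=2):
    """Differentiable approximation to the Wilcoxon-Mann-Whitney statistic.

    For every (positive, negative) pair, adds max(0, gamma - margin)**p,
    where margin = score_pos - score_neg. The loss is zero only when every
    positive outscores every negative by at least gamma.
    """
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    margins = pos[:, None] - neg[None, :]      # all pairwise margins
    return float(np.sum(np.maximum(0.0, gamma - margins) ** p))
```

Because the loss ranges over positive/negative pairs rather than individual samples, it is insensitive to the 1:20-style class imbalance that makes plain binary cross-entropy collapse toward the majority class.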
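Deriving the decision threshold by maximizing Youden's index (J = sensitivity + specificity - 1 = TPR - FPR over the ROC curve) can be sketched as follows; the exhaustive scan over candidate thresholds is an illustrative implementation choice, and the function name `youden_threshold` is assumed.

```python
import numpy as np

def youden_threshold(y_true, y_score):
    """Return the threshold on predicted probabilities maximising J = TPR - FPR."""
    P = (y_true == 1).sum()
    N = (y_true == 0).sum()
    best_t, best_j = None, -1.0
    for t in np.unique(y_score):               # each score is a candidate cut
        pred = y_score >= t
        tpr = (pred & (y_true == 1)).sum() / P
        fpr = (pred & (y_true == 0)).sum() / N
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t
```

Patients whose predicted sepsis probability meets or exceeds this threshold are then flagged as positive.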