Behind the Models
For model architecture, PredictoAI employs a combination of LSTM and Transformer models. LSTMs are adept at capturing long-term dependencies in time-series market data, which is essential for recognizing patterns in price movements and trade volume over time. Transformer models complement them: their self-attention mechanism lets every point in a sequence attend directly to every other point, allowing the model to weigh the importance of different time steps more effectively.
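As an illustration of how such a hybrid could be wired together (this is a minimal sketch, not PredictoAI's actual architecture; all layer sizes, the window length, and the feature count are placeholder assumptions), an LSTM branch can read the market-data sequence and a self-attention layer can then reweigh its time steps before a dense head makes the prediction:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_hybrid_model(seq_len=60, n_features=5):
    # Input: a window of market data (e.g. price and volume) per time step.
    # seq_len and n_features are illustrative placeholders.
    inputs = tf.keras.Input(shape=(seq_len, n_features))

    # LSTM branch: captures long-term temporal dependencies in the sequence.
    x = layers.LSTM(64, return_sequences=True)(inputs)

    # Self-attention: lets each time step attend to every other,
    # weighing the importance of different points in the sequence.
    attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(x, x)
    x = layers.LayerNormalization()(x + attn)  # residual connection

    # Pool over time and regress a single prediction target.
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(1)(x)
    return tf.keras.Model(inputs, outputs)

model = build_hybrid_model()
```

A single prediction head is shown here for simplicity; a real system would choose the output layer to match its target (e.g. a direction classifier or a multi-step forecast).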
Our models are structured with multiple layers of neural networks, including attention mechanisms that help the model focus on the most relevant data when making predictions. We combine supervised and unsupervised learning techniques: supervised learning drives the direct predictions, while unsupervised learning uncovers hidden structure within the data.
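One common way unsupervised learning can surface hidden structure in sequences is a sequence autoencoder: it compresses each market window into a small latent vector and learns to reconstruct the window from it, so similar market regimes end up near each other in latent space. The sketch below is a hedged illustration of that idea in Keras, not PredictoAI's code; every dimension is an assumed placeholder:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_autoencoder(seq_len=60, n_features=5, latent_dim=8):
    # Encoder: compress the whole window into a small latent vector.
    inputs = tf.keras.Input(shape=(seq_len, n_features))
    encoded = layers.LSTM(latent_dim)(inputs)

    # Decoder: expand the latent vector back into a full-length
    # sequence and reconstruct the original features step by step.
    x = layers.RepeatVector(seq_len)(encoded)
    x = layers.LSTM(32, return_sequences=True)(x)
    outputs = layers.TimeDistributed(layers.Dense(n_features))(x)

    autoencoder = tf.keras.Model(inputs, outputs)  # trained on reconstruction loss
    encoder = tf.keras.Model(inputs, encoded)      # used to embed windows
    return autoencoder, encoder

autoencoder, encoder = build_autoencoder()
```

After training the autoencoder on reconstruction error (no labels needed), the standalone encoder yields latent embeddings that can be clustered or fed to the supervised predictor as extra features.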
We implement these models in TensorFlow, a flexible platform that supports rapid experimentation and lets us refine the models continually, improving their accuracy with each iteration.
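A typical TensorFlow iteration loop of this kind might look like the following sketch, in which synthetic random arrays stand in for real market data and every hyperparameter is a placeholder assumption; early stopping ends a run as soon as validation loss stalls, so the architecture can be adjusted and retrained quickly:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Placeholder model: a single LSTM layer regressing one target value.
model = tf.keras.Sequential([
    layers.Input(shape=(30, 4)),
    layers.LSTM(16),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Synthetic stand-in for market data: 128 windows of 30 steps x 4 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((128, 30, 4)).astype("float32")
y = rng.standard_normal((128, 1)).astype("float32")

# Early stopping supports rapid iteration: halt when validation loss
# stops improving, keep the best weights, then tweak and retrain.
stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=2, restore_best_weights=True)
history = model.fit(X, y, validation_split=0.25, epochs=5,
                    batch_size=32, callbacks=[stop], verbose=0)
```

On random data the loss will not meaningfully improve; the point is only the shape of the workflow: compile, fit with a validation split, inspect `history.history`, and iterate.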