Combining deep reinforcement learning with technical analysis and trend monitoring on cryptocurrency markets

DOI: 10.1007/s00521-023-08516-x


ABSTRACT: Cryptocurrency markets have experienced a significant increase in popularity, which has motivated many financial traders to seek high profits in cryptocurrency trading. The predominant tool that traders use to identify profitable opportunities is technical analysis. Some investors and researchers have also combined technical analysis with machine learning in order to forecast upcoming trends in the market. Even with these methods, however, developing successful trading strategies remains an extremely challenging task. Recently, deep reinforcement learning (DRL) algorithms have demonstrated satisfactory performance in solving complicated problems, including the formulation of profitable trading strategies. While some DRL techniques have succeeded in increasing profit and loss (PNL) measures, they are not sufficiently risk-aware and have difficulty maximizing PNL and lowering trading risk simultaneously. This research proposes combining DRL approaches with rule-based safety mechanisms to both maximize PNL returns and minimize trading risk. First, a DRL agent is trained to maximize PNL returns using a novel reward function. Then, during the exploitation phase, a rule-based mechanism is deployed to prevent uncertain actions from being executed. Finally, another novel safety mechanism is proposed, which considers the actions of a more conservatively trained agent in order to identify high-risk trading periods and avoid trading during them. Our experiments on five popular cryptocurrencies show that the integration of these three methods achieves very promising results.
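The exploitation-phase safety logic described above can be sketched as a filter around the trained agent's action choice. This is a minimal illustration, not the paper's implementation: the function name `choose_action`, the three-action space (sell/hold/buy), the softmax confidence check, and the `threshold` value are all assumptions introduced here for clarity.

```python
import numpy as np

def choose_action(q_values, conservative_action, threshold=0.6):
    """Safety-filtered action selection (hypothetical sketch).

    q_values: the primary agent's values for actions [sell, hold, buy].
    conservative_action: the action proposed by a more conservatively
    trained agent; disagreement marks a high-risk trading period.
    """
    # Turn action values into a confidence distribution (softmax).
    probs = np.exp(q_values - q_values.max())
    probs /= probs.sum()
    action = int(np.argmax(probs))

    # Rule-based check: block uncertain actions during exploitation.
    if probs[action] < threshold:
        return 1  # fall back to "hold"

    # Trend-monitoring check: avoid trading when the conservative
    # agent disagrees with the primary agent.
    if action != conservative_action:
        return 1  # treat the period as high-risk and hold
    return action
```

Under this sketch, a trade is executed only when the primary agent is confident *and* the conservative agent agrees, mirroring the paper's idea of layering rule-based safeguards on top of a PNL-maximizing policy.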

– Combination of DRL approaches with rule-based safety mechanisms achieves promising results.
– Integration of three methods maximizes PNL returns and minimizes trading risk.


– DRL techniques are not sufficiently risk-aware and have difficulty maximizing PNL and lowering trading risk simultaneously.
– The integration of DRL approaches with rule-based safety mechanisms is proposed to address this limitation.

– Combination of DRL and technical analysis can lead to profitable trading strategies.
– Integration of DRL with rule-based safety mechanisms can maximize PNL returns and minimize trading risk.

– The integration of DRL approaches with rule-based safety mechanisms achieves promising results.
– The performance of the Integrated TraderNet-CR architecture is evaluated on five cryptocurrency markets.


– Cryptocurrency markets have gained popularity, attracting traders and investors.
– Technical analysis and machine learning are used to forecast market trends.
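Technical analysis, mentioned above as the predominant trading tool, boils down to rules computed from price history. A common example is a moving-average crossover; the sketch below is purely illustrative and is not a rule taken from the paper.

```python
def sma(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=3, long=5):
    """Return 'buy' when the short-window SMA is above the
    long-window SMA, 'sell' when below, else 'hold'.
    An illustrative technical-analysis rule, not the paper's."""
    if len(prices) < long:
        return "hold"  # not enough history to compute both averages
    s, l = sma(prices, short), sma(prices, long)
    if s > l:
        return "buy"
    if s < l:
        return "sell"
    return "hold"
```

Signals like this one are what machine-learning approaches, including the DRL agents discussed here, typically consume as input features when forecasting market trends.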

In this article, a combination of deep reinforcement learning (DRL) and rule-based safety mechanisms is proposed to both maximize profit and loss (PNL) returns and minimize trading risk.
