dc.creatorChen, Tzu-Chia
dc.creatorAlazzawi, Fouad Jameel Ibrahim
dc.creatorGrimaldo Guerrero, John William
dc.creatorChetthamrongchai, Paitoon
dc.creatorDorofeev, Aleksei
dc.creatorAras Masood, Ismael
dc.creatorAhmed, Alim Al Ayub
dc.creatorAkhmadeev, Ravil
dc.creatorLatipah, Asslia Johar
dc.creatorAbu Al-Rejal, Hussein
dc.date2022-04-05T12:48:49Z
dc.date2022
dc.date.accessioned2023-10-03T19:52:48Z
dc.date.available2023-10-03T19:52:48Z
dc.identifier1024-123X
dc.identifierhttps://hdl.handle.net/11323/9115
dc.identifierhttps://doi.org/10.1155/2022/3693263
dc.identifier10.1155/2022/3693263
dc.identifier1563-5147
dc.identifierCorporación Universidad de la Costa
dc.identifierREDICUC - Repositorio CUC
dc.identifierhttps://repositorio.cuc.edu.co/
dc.identifier.urihttps://repositorioslatinoamericanos.uchile.cl/handle/2250/9172922
dc.descriptionHybrid energy storage systems are a practical means of overcoming the limitations of single energy storage systems in terms of specific power and specific energy, and they are especially applicable in electric and hybrid vehicles. Applying a dynamic and coherent strategy plays a key role in managing a hybrid energy storage system. The data obtained while driving and the information collected from the energy storage systems can be used to analyze the performance of the proposed energy management method. Most existing energy management models follow predetermined rules that are unsuitable for vehicles moving under different modes and conditions. It is therefore advantageous to provide an energy management system that can learn from the environment and the driving cycle and send the needed data to a control system for optimal management. In this research, machine learning methods are applied to increase the efficiency of a hybrid energy storage management system. The energy management system is designed on the basis of machine learning so that it can learn to take the necessary actions in different situations directly, without selecting and running predefined rules. The advantage of this approach is accurate and effective control with high efficiency through direct interaction with the system's environment. The numerical results show that the proposed machine learning method achieves the lowest mean square error among all strategies.
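The record does not include the article's implementation details. As a rough illustration of the learning-based energy management the abstract describes, the sketch below shows a tabular Q-learning agent that learns how to split a driving-cycle power demand between a battery and a supercapacitor instead of following predefined rules. The environment model, state discretization, reward, and all names (step, discretize, the synthetic demand range) are hypothetical assumptions for illustration, not taken from the paper.

# Minimal illustrative sketch (not the article's method): tabular Q-learning
# learns a battery/supercapacitor power split from driving-cycle samples.
import numpy as np

rng = np.random.default_rng(0)

N_DEMAND_BINS = 10          # discretized power-demand states (0..50 kW, assumed)
N_SOC_BINS = 10             # discretized supercapacitor state-of-charge states
N_ACTIONS = 11              # supercapacitor share of demand: 0.0, 0.1, ..., 1.0
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = np.zeros((N_DEMAND_BINS, N_SOC_BINS, N_ACTIONS))

def discretize(value, low, high, bins):
    """Map a continuous value to one of `bins` integer states."""
    idx = int((value - low) / (high - low) * bins)
    return min(max(idx, 0), bins - 1)

def step(soc, demand, sc_share):
    """Toy environment: the supercapacitor covers `sc_share` of the demand,
    the battery the rest; the reward penalizes battery stress and SOC drift."""
    batt_power = (1.0 - sc_share) * demand
    soc = float(np.clip(soc - 0.02 * sc_share + 0.01 * (1.0 - sc_share), 0.0, 1.0))
    reward = -(batt_power / 50.0) ** 2 - (soc - 0.5) ** 2
    return soc, reward

# Train on a synthetic driving cycle of random power demands (kW).
soc, demand = 0.5, 25.0
for episode in range(200):
    for _ in range(500):
        s = (discretize(demand, 0.0, 50.0, N_DEMAND_BINS),
             discretize(soc, 0.0, 1.0, N_SOC_BINS))
        # epsilon-greedy action selection
        if rng.random() < EPSILON:
            a = int(rng.integers(N_ACTIONS))
        else:
            a = int(np.argmax(q_table[s]))
        soc, reward = step(soc, demand, a / (N_ACTIONS - 1))
        demand = rng.uniform(0.0, 50.0)          # next demand sample from the cycle
        s_next = (discretize(demand, 0.0, 50.0, N_DEMAND_BINS),
                  discretize(soc, 0.0, 1.0, N_SOC_BINS))
        # Q-learning update
        q_table[s + (a,)] += ALPHA * (reward + GAMMA * np.max(q_table[s_next]) - q_table[s + (a,)])

# Learned supercapacitor share for each (demand, SOC) state
print(np.argmax(q_table, axis=2) / (N_ACTIONS - 1))

In the article itself, such an agent would presumably be trained on recorded driving-cycle data and compared against baseline strategies using the mean square error mentioned in the abstract.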
dc.format8 pages
dc.formatapplication/pdf
dc.languageeng
dc.publisherHindawi Publishing Corporation
dc.publisherUnited States
dc.relationMathematical Problems in Engineering
dc.relation[1] F. Arslan, “A review of machine learning models for software cost estimation,” Review of Computer Engineering Research, vol. 6, no. 2, pp. 64–75, 2019.
dc.relation[2] A. Manishimwe, H. Alexander, H. Kaluuma, and M. Dida, “Integrated mobile application based on machine learning for East Africa stock market,” Journal of Information Systems Engineering & Management, vol. 6, no. 3, Article ID em0143, 2021.
dc.relation[3] L. H. Salazar, A. Fernandes, R. Dazzi, N. Garcia, and V. R. Leithardt, “Using different models of machine learning to predict attendance at medical appointments,” Journal of Information Systems Engineering and Management, vol. 5, no. 4, Article ID em0122, 2020.
dc.relation[4] R. Gaussmann, D. Coelho, A. Fernandes, P. Crocker, and V. R. Q. Leithardt, “Estimated maintenance costs of Brazilian highways using machine learning algorithms,” Journal of Information Systems Engineering and Management, vol. 5, no. 3, Article ID em0119, 2020.
dc.relation[5] Y.-P. Xu, P. Ouyang, S.-M. Xing, L.-Y. Qi, M. khayatnezhad, and H. Jafari, “Optimal structure design of a PV/FC HRES using amended Water Strider Algorithm,” Energy Reports, vol. 7, pp. 2057–2067, 2021.
dc.relation[6] A. Ma, J. Ji, and M. Khayatnezhad, “Risk-constrained nonprobabilistic scheduling of coordinated power-to-gas conversion facility and natural gas storage in power and gas based energy systems,” Sustainable Energy, Grids and Networks, vol. 26, Article ID 100478, 2021.
dc.relation[7] R. A. Salas-Rueda, “Analysis on the use of continuous improvement, technology and flipped classroom in the teaching-learning process by means of data science,” The Online Journal of Communication and Media Technologies, vol. 8, no. 4, pp. 325–343, 2018.
dc.relation[8] C. Dépature, S. Jemei, L. Boulon et al., “Energy management in fuel-cell/battery vehicles: key issues identified in the IEEE Vehicular Technology Society Motor Vehicle Challenge 2017,” IEEE Vehicular Technology Magazine, vol. 13, no. 3, pp. 144–151, 2018.
dc.relation[9] B. Subiyakto and S. Kot, “The government reform on healthcare facilities from the standpoint of service quality performance,” International Journal of Economics and Finance Studies, vol. 12, no. 1, pp. 16–31, 2020.
dc.relation[10] A. Mostafaeipour, M. Qolipour, H. Goudarzi et al., “Implementation of adaptive neuro-fuzzy inference system (ANFIS) for performance prediction of fuel cell parameters,” Journal of Renewable Energy and Environment, vol. 6, no. 3, pp. 7–15, 2019.
dc.relation[11] L. Mapfumo and S. Mutereko, “Contextualising stakeholder participation in the governance of Harare’s informal economy sector,” International Journal of Economics and Finance Studies, vol. 12, no. 1, pp. 103–118, 2020.
dc.relation[12] A. J. De Bruyn, “Harnessing Hr governance in effective virtual teams,” International Journal of Social Sciences and Humanity Studies, vol. 12, no. 1, pp. 1–17, 2020.
dc.relation[13] A. Mostafaeipour, A. Goli, and M. Qolipour, “Prediction of air travel demand using a hybrid artificial neural network (ANN) with Bat and Firefly algorithms: a case study,” The Journal of Supercomputing, vol. 74, no. 10, pp. 5461–5484, 2018.
dc.relation[14] N. E. L. Danielle and L. Masilela, “Open governance for improved service delivery innovation in South Africa,” International Journal of eBusiness and eGovernment Studies, vol. 12, no. 1, pp. 33–47, 2020.
dc.relation[15] K. Mosala and E. Chinomona, “Motorists attitudes towards implementation of E-tolls in Gauteng Province, South Africa,” International Journal of eBusiness and eGovernment Studies, vol. 12, no. 1, pp. 48–62, 2020.
dc.relation[16] A. H. Rasouli Amirabadi and M. Mirzaei, “Photosensitization of coronene–purine hybrids for photodynamic therapy,” Quarterly Journal of Iranian Chemical Communication, vol. 7, no. 4, pp. 352–471, 2019.
dc.relation[17] A. El-Khateeb, “Practical biochemistry principles and techniques approach,” Progress in Chemical and Biochemical Research, vol. 3, no. 3, pp. 180–193, 2020.
dc.relation[18] I. H. Sarker, “Machine learning: algorithms, real-world applications and research directions,” SN Computer Science, vol. 2, no. 3, pp. 1–21, 2021.
dc.relation[19] A. Rahimian Boogar, A. Gholamalizadeh Ahangar, and E. Shirmohamadi, “Efficiency of foliar application of humic acid on improve absorb of K than Na and salt tolerance in petunia hybrida L,” International Journal of Advanced Biological and Biomedical Research, vol. 2, no. 42, pp. 256–259.
dc.relation[20] M. Rahimi, S. Jamehbozorgi, H. Chermette, R. Ghiasi, and M. Poor Kalhor, “Computational study of substituent effect on the electronic properties of ferrocylidene acetophenones complexes,” Eurasian Chemical Communications, vol. 1, no. 5, pp. 411–418, 2019.
dc.relation[21] Y. Wang, X. Wu, B. Shao, X. Yang, G. Owens, and H. Xu, “Boosting solar steam generation by structure enhanced energy management,” Science Bulletin, vol. 65, no. 16, pp. 1380–1388, 2020.
dc.relation[22] W. Bi, Y. Shu, W. Dong, and Q. Yang, “Real-time energy management of microgrid using reinforcement learning,” in Proceedings of the 2020 19th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), pp. 38–41, IEEE, Xuzhou, China, October 2020.
dc.relation[23] Z. Yang, F. Wu, and J. Zhao, “A survey: limited data problem and strategy of reinforcement learning,” in Proceedings of the 2021 Chinese Intelligent Automation Conference, Springer, Singapore, pp. 471–481, 2022.
dc.relation[24] A. Chatzivasileiadi, E. Ampatzi, and I. Knight, “Characteristics of electrical energy storage technologies and their applications in buildings,” Renewable and Sustainable Energy Reviews, vol. 25, pp. 814–830, 2013.
dc.relation[25] O. Z. Sharaf and M. F. Orhan, “An overview of fuel cell technology: fundamentals and applications,” Renewable and Sustainable Energy Reviews, vol. 32, pp. 810–853, 2014.
dc.relation[26] R. S. Sutton and A. G. Barto, Reinforcement Learning: An Introduction, MIT Press, London, UK, 2018.
dc.relation[27] R. C. Hsu, C. T. Liu, and D. Y. Chan, “A reinforcement-learning-based assisted power management with QoR provisioning for human–electric hybrid bicycle,” IEEE Transactions on Industrial Electronics, vol. 59, no. 8, pp. 3350–3359, 2011.
dc.relation[28] X. Qi, G. Wu, K. Boriboonsomsin, M. J. Barth, and J. Gonder, “Data-driven reinforcement learning-based real-time energy management system for plug-in hybrid electric vehicles,” Transportation Research Record: Journal of the Transportation Research Board, vol. 2572, no. 1, pp. 1–8, 2016.
dc.relation[29] C. Liu and Y. L. Murphey, “Power management for plug-in hybrid electric vehicles using reinforcement learning with trip information,” in Proceedings of the 2014 IEEE Transportation Electrification Conference and Expo (ITEC), pp. 1–6, IEEE, Dearborn, MI, USA, June 2014.
dc.relation[30] T. Liu, Y. Zou, D. Liu, and F. Sun, “Reinforcement learning of adaptive energy management with transition probability for a hybrid electric tracked vehicle,” IEEE Transactions on Industrial Electronics, vol. 62, no. 12, pp. 7837–7846, 2015.
dc.relation[31] J. Hu, D. Liu, C. Du, F. Yan, and C. Lv, “Intelligent energy management strategy of hybrid energy storage system for electric vehicle based on driving pattern recognition,” Energy, vol. 198, p. 117298, 2020.
dc.relation[32] A. A. Kamel, H. Rezk, and M. A. Abdelkareem, “Enhancing the operation of fuel cell-photovoltaic-battery-supercapacitor renewable system through a hybrid energy management strategy,” International Journal of Hydrogen Energy, vol. 46, no. 8, pp. 6061–6075, 2021.
dc.relation[33] P. Zhao, Y. Wang, N. Chang, Q. Zhu, and X. Lin, “A deep reinforcement learning framework for optimizing fuel economy of hybrid electric vehicles,” in Proceedings of the 2018 23rd Asia and South Pacific Design Automation Conference (ASP-DAC), pp. 196–202, IEEE, Jeju, South Korea, January 2018.
dc.relation[34] R. Xiong, J. Cao, and Q. Yu, “Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle,” Applied Energy, vol. 211, pp. 538–548, 2018.
dc.relation[35] H. Yu, M. Kuang, and R. McGee, “Trip-oriented energy management control strategy for plug-in hybrid electric vehicles,” in Proceedings of the 2011 50th IEEE Conference on Decision and Control and European Control Conference, pp. 1323–1336, Orlando, FL, USA, January 2013.
dc.relation[36] X. Lin, Z. Wang, and J. Wu, “Energy management strategy based on velocity prediction using back propagation neural network for a plug-in fuel cell electric vehicle,” International Journal of Energy Research, vol. 45, no. 2, pp. 2629–2643, 2021.
dc.relation8
dc.relation1
dc.relation2022
dc.rightsCopyright © 2022 Tzu-Chia Chen et al.
dc.rightsAttribution 4.0 International (CC BY 4.0)
dc.rightshttps://creativecommons.org/licenses/by/4.0/
dc.rightsinfo:eu-repo/semantics/openAccess
dc.rightshttp://purl.org/coar/access_right/c_abf2
dc.sourcehttps://www.hindawi.com/journals/mpe/2022/3693263/
dc.subjectMachine learning
dc.subjectHybrid energy
dc.subjectElectric vehicles
dc.subjectStorage systems
dc.titleDevelopment of machine learning methods in hybrid energy storage systems in electric vehicles
dc.typeJournal article
dc.typehttp://purl.org/coar/resource_type/c_6501
dc.typeText
dc.typeinfo:eu-repo/semantics/article
dc.typeinfo:eu-repo/semantics/publishedVersion
dc.typehttp://purl.org/redcol/resource_type/ART
dc.typeinfo:eu-repo/semantics/acceptedVersion
dc.typehttp://purl.org/coar/version/c_ab4af688f83e57aa

