Watchlist (1 result)
Model-Based Offline Reinforcement Learning for Autonomous Delivery of Guidewire
Li, Hao; Zhou, Xiao-Hu; Xie, Xiao-Liang; Liu, Shi-Qi; Feng, Zhen-Qiu; Gui, Mei-Jiang; Xiang, Tian-Yu; Huang, De-Xing; Hou, Zeng-Guang
IEEE Transactions on Medical Robotics and Bionics, 6 (2024) 3, pp. 1054-1062
Link: https://doi.org/10.1109/tmrb.2024.3407349
RT Journal
T1 Model-Based Offline Reinforcement Learning for Autonomous Delivery of Guidewire
UL https://suche.suub.uni-bremen.de/peid=cr-10.1109_tmrb.2024.3407349&Exemplar=1&LAN=DE
A1 Li, Hao
A1 Zhou, Xiao-Hu
A1 Xie, Xiao-Liang
A1 Liu, Shi-Qi
A1 Feng, Zhen-Qiu
A1 Gui, Mei-Jiang
A1 Xiang, Tian-Yu
A1 Huang, De-Xing
A1 Hou, Zeng-Guang
PB Institute of Electrical and Electronics Engineers (IEEE)
YR 2024
SN 2576-3202
JF IEEE Transactions on Medical Robotics and Bionics
VO 6
IS 3
SP 1054
OP 1062
LK http://dx.doi.org/https://doi.org/10.1109/tmrb.2024.3407349
DO https://doi.org/10.1109/tmrb.2024.3407349
SF ELIB - SuUB Bremen