Merkliste (bookmark list)
1 result
1. A Multitier Reinforcement Learning Model for a Cooperative Multiagent System
Shi, Haobin; Zhai, Liangjing; Wu, Haibo; Hwang, Maxwell; Hwang, Kao-Shing; Hsu, Hsuan-Pei
IEEE Transactions on Cognitive and Developmental Systems, vol. 12, no. 3, pp. 636-644, 2020
Link: https://doi.org/10.1109/tcds.2020.2970487
RefWorks tagged export:

RT Journal
T1 A Multitier Reinforcement Learning Model for a Cooperative Multiagent System
UL https://suche.suub.uni-bremen.de/peid=cr-10.1109_tcds.2020.2970487&Exemplar=1&LAN=DE
A1 Shi, Haobin
A1 Zhai, Liangjing
A1 Wu, Haibo
A1 Hwang, Maxwell
A1 Hwang, Kao-Shing
A1 Hsu, Hsuan-Pei
PB Institute of Electrical and Electronics Engineers (IEEE)
YR 2020
SN 2379-8920
SN 2379-8939
JF IEEE Transactions on Cognitive and Developmental Systems
VO 12
IS 3
SP 636
OP 644
LK http://dx.doi.org/https://doi.org/10.1109/tcds.2020.2970487
DO https://doi.org/10.1109/tcds.2020.2970487
SF ELIB - SuUB Bremen
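The export record above uses a simple line-oriented tagged format: each line starts with a two-letter tag (RT, T1, A1, ...) followed by a value, and repeatable tags such as A1 (author) and SN (ISSN) occur once per value. A minimal sketch of how such a record could be parsed into fields, assuming exactly this "TAG value" line shape (the `parse_refworks` helper and the trimmed sample record are illustrative, not an official RefWorks importer):

```python
# Parse a RefWorks-style tagged record into a dict of tag -> list of values.
# Repeated tags (e.g. A1 for each author, SN for each ISSN) accumulate.

RECORD = """\
RT Journal
T1 A Multitier Reinforcement Learning Model for a Cooperative Multiagent System
A1 Shi, Haobin
A1 Zhai, Liangjing
A1 Wu, Haibo
YR 2020
SN 2379-8920
SN 2379-8939
JF IEEE Transactions on Cognitive and Developmental Systems
VO 12
IS 3
SP 636
OP 644
DO https://doi.org/10.1109/tcds.2020.2970487
"""

def parse_refworks(text):
    fields = {}
    for line in text.splitlines():
        # A valid data line is "XX value": two-char tag, a space, then the value.
        if len(line) < 4 or line[2] != " ":
            continue
        tag, value = line[:2], line[3:].strip()
        fields.setdefault(tag, []).append(value)
    return fields

record = parse_refworks(RECORD)
print(record["T1"][0])  # article title
print(record["A1"])     # list of authors
```

Collecting every tag into a list keeps the parser uniform; single-valued tags like T1 or YR are simply lists of length one.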