
Continuous-Time Markov Decision Processes: Theory and Applications (Stochastic Modelling and Applied Probability)

Description: Continuous-time Markov decision processes (MDPs), also known as controlled Markov chains, are used for modeling decision-making problems that arise in operations research (for instance, inventory, manufacturing, and queueing systems), computer science, communications engineering, control of populations (such as fisheries and epidemics), and management science, among many other fields. This volume provides a unified, systematic, self-contained presentation of recent developments in the theory and applications of continuous-time MDPs. The MDPs in this volume cover most of the cases that arise in applications, because they allow unbounded transition and reward/cost rates. Much of the material appears for the first time in book form. Onésimo Hernández-Lerma received the Science and Arts National Award from the Government of Mexico in 2001, an honorary doctorate from the University of Sonora in 2003, and the Scopus Prize from Elsevier in 2008. Xianping Guo received the He-Pan-Qing-Yi Best Paper Award at the 7th World Congress on Intelligent Control and Automation in 2008.
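The following is a minimal sketch, not taken from the book, of the kind of model the text studies: admission control in a single-server queue, formulated as a continuous-time MDP and simulated under an assumed threshold policy. All rates, costs, names (threshold_policy, simulate_discounted_cost), and parameter values are illustrative assumptions.

# A minimal sketch (assumed example, not from the book) of a continuous-time MDP:
# admission control in an M/M/1 queue. The controller chooses, in each state
# (queue length), whether to accept arrivals; transitions occur at exponential
# rates and a holding cost accrues continuously, discounted over time.
import random
import math

LAMBDA = 1.0       # arrival rate (assumed)
MU = 1.5           # service rate (assumed)
HOLD_COST = 1.0    # holding cost per customer per unit time (assumed)
REJECT_COST = 5.0  # lump-sum cost for rejecting an arrival (assumed)
DISCOUNT = 0.1     # continuous-time discount rate (assumed)


def threshold_policy(state: int, threshold: int = 5) -> str:
    """Stationary policy: accept arrivals while the queue is below a threshold."""
    return "accept" if state < threshold else "reject"


def simulate_discounted_cost(policy, horizon: float = 200.0, seed: int = 0) -> float:
    """Simulate one trajectory and return the discounted cost up to `horizon`."""
    rng = random.Random(seed)
    t, state, cost = 0.0, 0, 0.0
    while t < horizon:
        action = policy(state)
        # Exponential holding time until the next event (arrival or service completion).
        arrival_rate = LAMBDA
        service_rate = MU if state > 0 else 0.0
        total_rate = arrival_rate + service_rate
        dt = min(rng.expovariate(total_rate), horizon - t)
        # Holding cost accrues continuously, discounted at rate DISCOUNT:
        # integral of HOLD_COST * state * exp(-DISCOUNT*s) over [t, t+dt].
        cost += HOLD_COST * state * math.exp(-DISCOUNT * t) * (1 - math.exp(-DISCOUNT * dt)) / DISCOUNT
        t += dt
        if t >= horizon:
            break
        # Decide which event fired.
        if rng.random() < arrival_rate / total_rate:
            if action == "accept":
                state += 1
            else:
                cost += REJECT_COST * math.exp(-DISCOUNT * t)
        else:
            state = max(0, state - 1)
    return cost


if __name__ == "__main__":
    costs = [simulate_discounted_cost(threshold_policy, seed=s) for s in range(100)]
    print("estimated discounted cost:", sum(costs) / len(costs))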

Price: 108 USD

Location: Matraville, NSW

End Time: 2024-12-05T07:07:36.000Z

Shipping Cost: 0 USD

Product Images

Continuous-Time Markov Decision Processes: Theory and Applications (Stochastic Modelling and Applied Probability)

Item Specifics

Return shipping will be paid by: Buyer

All returns accepted: Returns Accepted

Item must be returned within: 60 Days

Refund will be given as: Money Back

Return policy details:

EAN: 9783642260728

UPC: 9783642260728

ISBN: 9783642260728

MPN: N/A

Item Height: 1.4 cm

Number of Pages: xviii + 234 pages

Language: English

Publication Name: Continuous-Time Markov Decision Processes : Theory and Applications

Publisher: Springer Berlin / Heidelberg

Subject: Probability & Statistics / Stochastic Processes, Decision-Making & Problem Solving, Probability & Statistics / General, Operations Research, Optimization

Publication Year: 2012

Item Weight: 16 oz

Type: Textbook

Item Length: 9.3 in

Subject Area: Mathematics, Business & Economics

Author: Xianping Guo, Onésimo Hernández-Lerma

Item Width: 6.1 in

Series: Stochastic Modelling and Applied Probability Ser.

Format: Trade Paperback

Recommended

Costa - Continuous-Time Markov Jump Linear Systems - New paperback or - N555z
$157.21

Continuous-Time Markov Jump Linear Systems - 9783642431128
$94.23

Continuous-Time Markov Chains - 9781461277729
$114.74

Probability Models in Operations Research, Paperback by Cassady, C. Richard; ...
$98.86

Continuous-Time Markov Decision Processes: Borel Space Models and General Contro
$244.30

Prieto-Rumeau - Selected Topics on Continuous-Time Controlled Markov - S9000z
$132.08

Continuous-Time Markov Decision Processes: Borel Space Models and General: New
$188.12

Continuous-Time Markov Decision Processes : Theory and Applications, Hardcove...
$134.10

Continuous-time Markov Decision Processes : Borel Space Models and General Co...
$106.39

Continuous-Time Markov Decision Processes: Theory and Applications by Xianping G
$136.61