Abstract
The channel capacity of a deterministic system with confidential data is
an upper bound on the number of bits an attacker can learn from the system.
We encode all possible attacks on a system using a probabilistic specification, an Interval Markov Chain. The channel capacity computation then reduces to finding a model of the specification with the highest entropy.
Entropy maximization for probabilistic process specifications has not been studied before, even though it is well studied in Bayesian inference for discrete distributions.
We give a characterization of the global entropy of a process as a reward function, a polynomial-time algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains, and its application to synthesizing an implementation maximizing entropy.
We show how to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and use the above results to compute their channel capacity. These results are a foundation for ongoing work on computing the channel capacity of abstractions of programs derived from code.
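The characterization of global entropy as a reward function can be illustrated with a minimal sketch. Assuming an absorbing Markov chain with transition matrix P, the global entropy H satisfies H(s) = L(s) + Σ_s' P(s,s')·H(s'), where L(s) is the local (per-state) entropy; fixing H = 0 at absorbing states gives a linear system. The function names and the example chain below are illustrative, not from the paper:

```python
import numpy as np

def local_entropy(P):
    """Local entropy of each state: L(s) = -sum_s' P(s,s') * log2 P(s,s')."""
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, -P * np.log2(P), 0.0)
    return terms.sum(axis=1)

def global_entropy(P, absorbing):
    """Global entropy as an expected reward, with local entropy as the
    per-state reward: solves H = L + P @ H, fixing H = 0 on absorbing states."""
    n = P.shape[0]
    L = local_entropy(P)
    A = np.eye(n) - P
    for s in absorbing:          # pin H(s) = 0 at absorbing states
        A[s] = 0.0
        A[s, s] = 1.0
        L[s] = 0.0
    return np.linalg.solve(A, L)

# Example: a fair coin flip from state 0 into absorbing states 1 and 2.
P = np.array([[0.0, 0.5, 0.5],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
print(global_entropy(P, absorbing=[1, 2])[0])  # 1.0 bit
```

The paper's contribution is maximizing this quantity not over one fixed chain but over all chains implementing an Interval Markov Chain specification; the sketch only shows the reward characterization on a concrete chain.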
Original language: English
Journal: Journal of Logic and Algebraic Programming
Volume: 83
Issue number: 5–6
Pages (from–to): 384–399
Number of pages: 12
ISSN: 2352-2208
Publication status: Published, 2014
Keywords
- Channel Capacity
- Deterministic Systems
- Confidential Data
- Interval Markov Chains
- Entropy Maximization
Projects

MTLab  Modelling of Information Technology
Wasowski, A. (CoI), Godskesen, J. C. (PI), Song, L. (CoI), Traonouez, L.M. (CoI) & Biondi, F. (CoI)
01/11/2008 → 31/10/2013
Project: Research