“Irregular applications, such as graph analytics and sparse linear algebra, exhibit frequent indirect, data-dependent accesses to single or short sequences of elements that cause high main memory ...
A team of researchers at ETH Zurich is working on a novel approach to solving increasingly large graph problems. Large graphs are a basis of many problems in social sciences (e.g., studying human ...
Abstract: The speed of algorithms on massive graphs depends on the size of the given data. Grammar-based compression is a technique to compress the size of a graph while still allowing one to read or to ...
Information theory provides the fundamental framework for understanding and designing data compression algorithms. At its core lies the concept of entropy, a quantitative measure that reflects the ...
The rapid expansion of data volumes in modern applications has intensified the need for efficient methods of storing and retrieving information. Contemporary research in data compression focuses on ...
Meta has announced the OpenZL compression framework, which achieves high compression rates while maintaining high speed. By building dedicated compression programs optimized for specific formats, it ...
With more and more embedded systems being connected, sending state information from one machine to another has become more common. However, sending large packets of data around on the network can be ...
In context: Thanks to improvements in DNA sequencing techniques, the number of public repositories containing genetic data is growing at a significant pace. Zurich researchers have been working on a ...