Abstract
We introduce and perform numerical simulations of a lattice model for the compaction of a granular system, based on the ideas of random sequential adsorption and diffusional relaxation. The lattice consists of a given number of horizontal layers in which nonoverlapping particles diffuse. Besides diffusing within its own layer, a particle falls to the layer below whenever there is enough space there. We restrict ourselves to the case of one-dimensional layers and particles that occupy k consecutive sites. We observe algebraic decay in time of the particle density, with exponents that are in several cases distinct from the mean-field values.
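The model described above can be sketched in code. The following is a minimal illustrative implementation, not the authors' actual simulation: all parameter names (`num_layers`, `L`, `k`) and the specific update schedule (one adsorption attempt on the top layer, then one diffusion/downfall attempt per step) are assumptions for the sake of a concrete example.

```python
import random

def is_free(occ, layer, pos, k, exclude=None):
    """True if sites pos..pos+k-1 of `layer` are empty (ignoring `exclude`)."""
    return all(occ[layer][s] in (None, exclude) for s in range(pos, pos + k))

def simulate(num_layers=3, L=20, k=2, steps=5000, seed=1):
    """Sketch of the lattice model: `num_layers` one-dimensional layers of
    L sites each; particles occupy k consecutive sites. Each time step:
    (1) random sequential adsorption attempt of a k-mer on the top layer;
    (2) a randomly chosen particle attempts a unit diffusion step within
        its layer, then falls to the layer below if k sites are free there.
    Returns the fraction of occupied sites in the bottom layer."""
    rng = random.Random(seed)
    occ = [[None] * L for _ in range(num_layers)]  # occ[layer][site] -> particle id
    particles = {}                                 # id -> (layer, leftmost site)
    next_id = 0
    for _ in range(steps):
        # (1) adsorption attempt at a random position on the top layer (layer 0)
        pos = rng.randrange(L - k + 1)
        if is_free(occ, 0, pos, k):
            particles[next_id] = (0, pos)
            for s in range(pos, pos + k):
                occ[0][s] = next_id
            next_id += 1
        # (2) diffusion and downfall of one randomly chosen particle
        if particles:
            pid = rng.choice(list(particles))
            layer, pos = particles[pid]
            new = pos + rng.choice((-1, 1))
            if 0 <= new <= L - k and is_free(occ, layer, new, k, exclude=pid):
                for s in range(pos, pos + k):
                    occ[layer][s] = None
                for s in range(new, new + k):
                    occ[layer][s] = pid
                pos = new
                particles[pid] = (layer, pos)
            # fall one layer down whenever there is enough space below
            if layer + 1 < num_layers and is_free(occ, layer + 1, pos, k):
                for s in range(pos, pos + k):
                    occ[layer][s] = None
                    occ[layer + 1][s] = pid
                particles[pid] = (layer + 1, pos)
    bottom = occ[num_layers - 1]
    return sum(s is not None for s in bottom) / L
```

Tracking the bottom-layer density as a function of time in such a sketch is the kind of measurement from which the decay exponents mentioned above would be extracted.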