Modern wireless network architectures are increasingly exploring the use of intermediate radio nodes, called relays, to improve system coverage and capacity. Beyond significant modeling and theoretical analysis, the ongoing integration of relays into the Long Term Evolution (LTE) cellular standard promises key benefits but also poses significant challenges. This dissertation develops theoretical and practical tools for evaluating the effect that different relaying approaches have on the fundamental tradeoffs among the rate, reliability, and delay with which data is transmitted in radio communication systems. These tools range from the information theoretic, which gives general insight into encoding and decoding procedures, to the experimental, which quantifies the performance of real-world implementations. After reviewing related literature, we develop an achievable rate-reliability-delay tradeoff for decode-and-forward on the general relay channel by extending Gallager's random coding error exponent from the point-to-point case. The achievable tradeoff varies with the encoding and decoding approaches taken and can be improved by increasing the decoding window size at the destination. We next propose several low-latency relay processing schemes for the multihop channel. By having the relay exploit only part of the source's channel code structure, we can adaptively trade off increased latency for greater reliability. Simulations confirm that these schemes outperform existing techniques for certain network geometries, both for convolutional codes and for the turbo code employed in LTE. Finally, we implement a wireless relaying testbed whose design and parameter selection are themselves inspired by LTE. Experimentation captures more of the effects of the various protocol layers when evaluating the relationship among rate, reliability, and delay for different relay types. Initial characterization experiments with the testbed are reported, and directions for future development are suggested.
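For reference, the point-to-point result being extended is Gallager's random coding bound, stated here in its standard textbook form (this is background, not the relay-channel exponent derived in the dissertation): for a discrete memoryless channel with transition probabilities $p(y \mid x)$, input distribution $Q$, rate $R$ (in nats per channel use), and block length $N$, the average block error probability of the random code ensemble satisfies $P_e \le e^{-N E_r(R)}$, where
\[
E_r(R) = \max_{0 \le \rho \le 1} \max_{Q} \bigl[ E_0(\rho, Q) - \rho R \bigr],
\qquad
E_0(\rho, Q) = -\ln \sum_{y} \Bigl[ \sum_{x} Q(x)\, p(y \mid x)^{\frac{1}{1+\rho}} \Bigr]^{1+\rho}.
\]
A larger $E_r(R)$ means the error probability decays faster in the block length, which is the sense in which reliability trades off against rate and, through $N$, against delay.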