Nowadays, with the extensive use of computers, huge amounts of data are generated every day. The need to understand large, complex, and information-rich data sets is important to many fields in business, science, and engineering. However, due to complicated patterns, hidden relationships, and the multidimensional and multimodal nature of the data, it is not easy for users to glean insights. These issues have become critical with the exponential growth of data over the years. In this dissertation, I focus on graph-based techniques to tackle these issues, because graphs can naturally encode various kinds of relationships, reveal complicated patterns, and allow users to easily identify features of interest. I demonstrate the effectiveness of graph-based representations by applying them to three kinds of data: time-varying volumetric data, image-text collections, and eye-tracking data.

Time-varying volume visualization plays an essential role in many scientific, engineering, and medical disciplines. Many works have presented novel algorithms and techniques for processing, managing, and rendering time-varying volume data. However, two critical challenges remain. The first is visual occlusion caused by projecting the 3D data to a 2D image during rendering, a problem exacerbated when the time and variable dimensions are considered. The second is the lack of support for helping users analyze and track how the data change and evolve. As data sizes keep increasing, these challenges will only become more severe. This dissertation proposes three solutions that leverage the generalized, understandable, and familiar form of graphs to address a wider range of problems in visualizing time-varying scientific data. First, I will present TransGraph, which maps the evolution of a 4D space-time data set to a 2D graph representation so that the transition relationships in the data can be revealed in a single graph.
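The core idea of such a transition graph can be conveyed with a minimal sketch, in which voxel values are quantized into discrete states and an edge counts how often one state changes into another between consecutive time steps. This is an illustrative simplification under assumed inputs (flat lists of scalar values in [0, 1)), not TransGraph's actual algorithm.

```python
# Illustrative sketch of a transition graph for time-varying data.
# Nodes are quantized value "states"; edge weights count how often a
# voxel moves from one state to another between consecutive time steps.
# This is NOT TransGraph's actual construction, only a toy analogue.
from collections import defaultdict

def build_transition_graph(volumes, num_bins=4):
    """volumes: list of time steps, each a flat list of scalar voxel
    values in [0, 1). Returns {(state_a, state_b): count} edge weights."""
    def quantize(v):
        return min(int(v * num_bins), num_bins - 1)

    edges = defaultdict(int)
    for t in range(len(volumes) - 1):
        for v_now, v_next in zip(volumes[t], volumes[t + 1]):
            a, b = quantize(v_now), quantize(v_next)
            if a != b:  # record only actual state changes
                edges[(a, b)] += 1
    return dict(edges)

# Two time steps of a tiny 4-voxel "volume" whose values rise over time.
steps = [[0.1, 0.3, 0.6, 0.9],
         [0.3, 0.6, 0.6, 0.9]]
print(build_transition_graph(steps))  # → {(0, 1): 1, (1, 2): 1}
```

Even in this toy form, the single static graph summarizes how the data evolve across all time steps, which is the property the dissertation exploits.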
Through brushing and linking, users can track the evolution of spatiotemporal data over time. Second, in many cases, users can only rely on low-level visual hints from transition graphs, such as the sizes and densities of nodes and edges, to identify interesting features to explore. This places a heavy burden on users to spot interesting graph features and connect them to data features. I will therefore introduce graph analytics techniques, such as graph simplification, community detection, and visual recommendation, to help users explore the transition graph and understand the time-varying volumetric data. Third, I will present an indexable tree named iTree for time-varying data visualization, which integrates efficient data compacting, indexing, querying, and classification into a single framework.

Visualization of image-text collections has many practical applications in our daily lives. A significant challenge is the lack of ability to browse, navigate, and explore a large heterogeneous collection in an adaptive manner. This dissertation presents a graph-based framework called iGraph that shows the relationships among images and texts in a dynamic fashion with progressive drawing capability. iGraph also supports node comparison and visual recommendation for guided navigation and exploration.

Eye-tracking data visualization is important for understanding the patterns of eye movements. There are two major challenges that I aim to address. First, because viewers revisit regions of the screen, scanpaths may overlap with one another, which not only causes visual occlusion but also obscures the reading patterns. Second, as the number of participants increases, classification and cross-comparison of the participants become increasingly difficult.
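One concrete aspect of the first challenge, the long regressive saccades that jump back over already-read regions and cause scanpaths to overlap, can be illustrated with a minimal sketch. The fixation data, the amplitude-based criterion, and the threshold parameter k are all assumptions for illustration, not the actual criteria used in this dissertation.

```python
# Illustrative sketch: flagging saccade outliers in a scanpath by
# amplitude. A saccade is the jump between consecutive fixations; jumps
# far longer than typical (e.g., regressions back to an earlier line)
# are flagged. Toy criterion only, not the dissertation's method.
import math

def saccade_outliers(fixations, k=1.5):
    """fixations: list of (x, y) fixation centers in screen coordinates.
    Returns indices i where the saccade from fixation i to i+1 has
    length > mean + k * stdev of all saccade lengths."""
    lengths = [math.dist(fixations[i], fixations[i + 1])
               for i in range(len(fixations) - 1)]
    mean = sum(lengths) / len(lengths)
    std = math.sqrt(sum((l - mean) ** 2 for l in lengths) / len(lengths))
    return [i for i, l in enumerate(lengths) if l > mean + k * std]

# A reading-like scanpath with one long jump down and across the screen.
path = [(0, 0), (10, 0), (20, 0), (30, 0), (200, 40), (210, 40)]
print(saccade_outliers(path))  # → [3]
```

Flagging such jumps separates routine reading saccades from the revisits that clutter the scanpath rendering.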
I introduce ETGraph, a graph-based analytics framework that visualizes reading patterns, identifies saccade outliers, and classifies participants.

In reference to IEEE copyrighted material, which is used with permission in this thesis, the IEEE does not endorse any of the University of Notre Dame's products or services. Internal or personal use of this material is permitted. If interested in reprinting/republishing IEEE copyrighted material for advertising or promotional purposes, or for creating new collective works for resale or redistribution, please go to http://www.ieee.org/publications_standards/publica... to learn how to obtain a License from RightsLink.