Biomedical image analysis plays a vital role in disease investigation and computer-aided diagnosis. Operations such as target object segmentation and classification are principal tools in biomedical image study. In medical images, segmentation is performed to extract regions of interest from the background, while classification is performed to detect the presence of a target disease. Both segmentation and classification are fundamental problems in the medical imaging research area, and both can provide insights for quantitative biological research and disease studies.

In the last decade, convolutional neural networks (CNNs) have achieved much success in biomedical image analysis. However, in most cases, standardized CNN models are used irrespective of the data under consideration. The absence of domain-specific knowledge about the input data limits the performance and accuracy of such boilerplate CNNs. Since each dataset has its own domain-specific challenges, we believe data-driven approaches can improve model performance.

In this dissertation, multiple data-driven approaches are proposed for CNN-based biomedical image segmentation and classification models. The main objective of this work is to exploit domain-specific prior knowledge to improve model performance and accuracy. Specifically, we consider data-driven approaches from three directions of segmentation/classification model design: (1) network architecture, (2) requirement-based modeling, and (3) application-driven design. From the network architecture perspective, we present approaches such as data-driven deep supervision, graph convolution based topology inclusion, and uncertainty-driven multi-objective learning. Image complexity guided network compression is presented to highlight the benefits of data-driven, requirement-based modeling for medical image segmentation. Finally, on data-driven approaches for application-driven design, we present our framework for the assessment of CNN-based methods for vitiligo diagnosis.