FDA

An agency in the U.S. federal government whose mission is to protect public health by making sure that food, cosmetics, and nutritional supplements are safe to use and truthfully labeled. The FDA also makes sure that drugs, medical devices, and equipment are safe and effective, and that blood for transfusions and transplant tissue are safe. Also called Food and Drug Administration.

Source: NCI Dictionary of Cancer Terms