Vitamin and mineral deficiencies are widespread in Tanzania, with documented deficiencies of at least vitamin A, iron, folate and zinc, resulting in lasting negative consequences, especially for maternal health, cognitive development and thus the nation's economic potential. Folate deficiency is associated with significant adverse health effects among women of reproductive age, including a higher risk of neural tube defects. Several countries, including Tanzania, have implemented mandatory fortification of wheat and maize flour, but evidence on the effectiveness of these programs in developing countries remains limited. We evaluated the effectiveness of Tanzania's food fortification program by examining folate levels among women of reproductive age (18–49 years). We conducted a prospective cohort study of 600 non-pregnant women enrolled concurrently with the initiation of food fortification and followed up for 1 year thereafter. Blood samples, dietary intake and fortified-food consumption data were collected at baseline and at 6 and 12 months. Plasma folate levels were determined using a competitive assay with folate-binding protein. Using univariate and multivariate linear regression, we compared the change in plasma folate levels at six and twelve months of the program from baseline. We also assessed the relative risk of folate deficiency during follow-up using log-binomial regression. The mean (±SE) pre-fortification plasma folate level for the women was 5.44 ng/ml (±2.30) at baseline. Levels improved significantly at six months [difference: 4.57 ng/ml (±2.89)] and 12 months [difference: 4.27 ng/ml (±4.18)]. Based on a plasma folate cut-off of 4 ng/ml, the prevalence of folate deficiency was 26.9% at baseline and 5% at twelve months. A one ng/ml increase in plasma folate from baseline was associated with a 25% decreased risk of folate deficiency at 12 months [RR = 0.75; 95% CI = 0.67–0.85].

600kcals and <4500kcals.
Venous blood specimens (about 4 ml) were collected in purple-top tubes containing K2EDTA using standard venepuncture procedures, and samples were sent to the Africa Academy of Public Health (AAPH) laboratory within two hours of collection. Once in the laboratory, specimens were centrifuged at 3200 rpm for ten minutes to obtain plasma, which was stored at -20°C in 1 to 3 ml vials. Laboratory scientists verified that specimens were free from hemolysis, lipemia and icterus. Plasma folate assays were done using the Cobas e411 automated analyzer (Roche Diagnostics, Switzerland) as per the manufacturer's instructions. The analyzer was calibrated using lyophilized human plasma containing folate. The contents of the calibrator vials were dissolved by carefully adding 1.0 ml of distilled water, and the calibrators were mixed gently, avoiding foam formation. Aliquots of the reconstituted calibrators were transferred into empty vials and stored at -20°C until needed; when required, a pair of calibrators was thawed at room temperature before use. Testing proceeded after successful calibration, with 250–300 μl of each sample run in duplicate and average readings recorded. Negative and positive controls were used for quality assurance in each run; controls were run individually at least once per 24 hours while the test was in use, once per new reagent kit, and after each passed calibration. Printed results were verified by one laboratory staff member and reviewed by a second before being entered into a database. The normal range for the equipment is 1.5–20.0 ng/ml.

We analysed the change in mean plasma folate levels from baseline to 6 and 12 months of follow-up. Mean (±SD) values of total energy intake (in calories/day), macronutrient proportions of total calorie intake, and intake of fortified wheat-based foods (in servings/day) were calculated to estimate folic acid intake at baseline.
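The change-from-baseline analysis described above can be sketched as follows. This is an illustrative Python sketch (the study's analyses were done in SAS), computing per-woman differences in plasma folate and the paired Student's t statistic; the folate values below are hypothetical, not study data.

```python
import math
from statistics import mean, stdev

def paired_t(baseline, follow_up):
    """Paired Student's t-test statistic for within-woman change
    in plasma folate (ng/ml) from baseline to a follow-up visit."""
    diffs = [f - b for b, f in zip(baseline, follow_up)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return mean(diffs), t  # mean change and t statistic (df = n - 1)

# Hypothetical plasma folate values (ng/ml), not real study data:
baseline = [5.1, 4.8, 6.0, 5.5, 4.2, 5.9]
month6 = [9.6, 9.1, 10.8, 10.2, 8.7, 10.5]
change, t = paired_t(baseline, month6)
```

Because the same women are measured at each visit, the test is applied to the within-woman differences rather than to the two cross-sectional means.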
We estimated the change in levels of these measures from baseline and assessed statistical significance using a paired Student's t-test. The concentrations suggested for defining folate deficiency based on metabolic indicators range from 3–4 ng/ml [15,16,34]. This value is derived from data on preventing anaemia and hyperhomocysteinemia, and the public health significance of applying the same cut-off to identify folate deficiency in the context of NTDs is not fully understood [15,34]. Evidence suggests that the risk of NTDs increases at folate insufficiency levels higher than the ranges defining folate deficiency [34]. We dichotomized plasma folate levels at the stringent threshold indicative of folate deficiency, using a cut-off point of 4 ng/ml [15,16]. We fit univariate and multivariate log-binomial regression models to assess the degree to which each unit change in plasma folate was associated with a change in the risk of folate deficiency at six and twelve months, and obtained risk ratio estimates [35]. In multivariate models, we included potential confounders known to be associated with folic acid intake and/or plasma folate levels, or identified in regression models (p < 0.2) as significantly related to dietary intake of folic acid at baseline. Relative risks were adjusted for age (18 to <26, 26 to <36 and 36–49 years), years of formal education (0–7, 8–11 and ≥12 years), occupation (business/professional, skilled formal, skilled informal, unskilled, unemployed), body mass index (<18.5, 18.5 to <25, 25 to <30, ≥30 kg/m2), household dietary diversity score (1–12), baseline intake of fortified wheat-based foods (servings per day), intake of vegetables, and total energy intake (kcals/day). The number of household assets was computed from a simple count that included TV, radio, generator, fan, bike, car, couch and fridge, as well as access to electricity and potable water, allowing classification into three socioeconomic groups: 0–5, 6–8, and 9–10 assets [36].
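The risk ratio that the log-binomial models estimate can be illustrated in its simplest, unadjusted form. This Python sketch computes the risk ratio and a Wald 95% CI from a 2×2 table for a single binary exposure; the study's actual models were covariate-adjusted log-binomial regressions fit in SAS, and the counts below are hypothetical.

```python
import math

def risk_ratio(exposed_cases, exposed_n, unexposed_cases, unexposed_n, z=1.96):
    """Unadjusted risk ratio with a Wald 95% CI computed on the log scale."""
    r1 = exposed_cases / exposed_n    # risk in the exposed group
    r0 = unexposed_cases / unexposed_n  # risk in the unexposed group
    rr = r1 / r0
    # Standard error of log(RR) for a 2x2 table
    se_log = math.sqrt(1 / exposed_cases - 1 / exposed_n
                       + 1 / unexposed_cases - 1 / unexposed_n)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, (lo, hi)

# Hypothetical counts of folate deficiency (<4 ng/ml) at 12 months:
rr, ci = risk_ratio(10, 200, 40, 200)  # rr = 0.25
```

An RR below 1 with a CI excluding 1, as in the study's estimate of 0.75 (95% CI 0.67–0.85), indicates a reduced risk of deficiency.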
Covariates with missing data were retained in the analysis using the missing indicator method [37]. P-values were two-sided, and significance was set at P < 0.05. The final data set (S2 File) was compiled, and all statistical analyses were conducted using SAS version 9.2 (SAS Institute Inc).
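The missing indicator method mentioned above can be sketched as follows: a missing covariate value is replaced with a constant and a 0/1 missingness indicator is added to the model, so that records with incomplete covariates are not dropped. This is an illustrative Python sketch (the study used SAS), and the BMI values are hypothetical.

```python
def missing_indicator(values, fill=0.0):
    """Missing-indicator method: fill missing covariate values with a
    constant and return a 0/1 indicator of missingness; both columns
    then enter the regression model."""
    filled = [fill if v is None else v for v in values]
    indicator = [1 if v is None else 0 for v in values]
    return filled, indicator

# Hypothetical BMI values (kg/m2) with two missing observations:
bmi = [22.4, None, 27.1, None, 19.8]
bmi_filled, bmi_missing = missing_indicator(bmi)
# bmi_filled -> [22.4, 0.0, 27.1, 0.0, 19.8]
# bmi_missing -> [0, 1, 0, 1, 0]
```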