We develop a constructive approach to estimating sparse linear regression models in high dimensions. The proposed approach is a computational algorithm that iteratively generates a sequence of solutions, based on support detection using primal and dual information and root finding according to a modified KKT condition for the L0-penalized least squares criterion. We refer to the proposed algorithm as SDAR, for support detection and root finding. Under certain regularity conditions on the design matrix and a sparsity assumption on the regression coefficients, we show that, with high probability, the errors of the solution sequence decay exponentially to the minimax error bound in the Gaussian noise case. Moreover, with high probability, SDAR takes no more than O(log R) steps to recover the oracle estimator, where R is the relative magnitude of the nonzero coefficients. We obtain similar results when the model is not strictly sparse but most of the regression coefficients are small. A computational complexity analysis shows that the cost of SDAR is O(np) per step if the regression coefficients are sparse or can be sparsely approximated. Simulation studies support our theoretical results. We also consider ASDAR, an adaptive version of SDAR, to make the method more practical in applications. Numerical comparisons with the Lasso, MCP, and several greedy methods indicate that ASDAR is competitive with or outperforms them in accuracy and efficiency.
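To make the iteration concrete, the following is a minimal Python sketch of one plausible reading of an SDAR step. It alternates the two moves named above: support detection keeps the T entries where the combined primal information (the current coefficients) and dual information (the scaled residual correlations) is largest in magnitude, and root finding solves the KKT system restricted to that support, which reduces to least squares on the active columns. The function name, the stopping rule, and the assumptions that the columns of X are standardized and that the sparsity level T is known in advance are all illustrative, not the paper's exact specification.

```python
import numpy as np

def sdar_sketch(X, y, T, max_iter=50):
    """Illustrative sketch of an SDAR-style iteration (not the paper's exact algorithm).

    X : (n, p) design matrix, columns assumed standardized to norm sqrt(n)
    y : (n,) response vector
    T : assumed sparsity level (number of nonzero coefficients)
    """
    n, p = X.shape
    beta = np.zeros(p)                        # primal variable (coefficients)
    d = X.T @ (y - X @ beta) / n              # dual variable (scaled residual correlations)
    active = np.array([], dtype=int)
    for _ in range(max_iter):
        # Support detection: keep the T largest entries of |beta + d|,
        # combining primal and dual information.
        new_active = np.argsort(-np.abs(beta + d))[:T]
        new_active.sort()
        if np.array_equal(new_active, active):
            break                             # detected support has stabilized
        active = new_active
        # Root finding: solve the KKT system restricted to the detected
        # support, i.e. least squares on the active columns.
        beta = np.zeros(p)
        beta[active] = np.linalg.lstsq(X[:, active], y, rcond=None)[0]
        d = X.T @ (y - X @ beta) / n
        d[active] = 0.0                       # residual is orthogonal to active columns
    return beta, active

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, T = 100, 500, 5
    X = rng.standard_normal((n, p))
    X /= np.linalg.norm(X, axis=0) / np.sqrt(n)   # standardize columns
    beta_true = np.zeros(p)
    beta_true[:T] = rng.uniform(1.0, 3.0, T) * rng.choice([-1.0, 1.0], T)
    y = X @ beta_true + 0.1 * rng.standard_normal(n)
    beta_hat, support = sdar_sketch(X, y, T)
    print("recovered support:", support)
```

Under this reading, the dominant per-step cost is the matrix-vector product X.T @ (y - X @ beta), which is O(np), plus the restricted least squares on T columns; for small T this is consistent with the O(np) per-step cost stated above.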