Egocentric distance perception over distances of 2–20 meters has been extensively studied both in real-world environments and in virtual environments (VEs) presented through head-mounted displays (HMDs). Far less investigation has been performed within projection-based VEs, partly because the limited space in front of projection-based hardware restricts the measurement techniques that can be used. A standard measurement technique is blindfolded walking, in which subjects observe the object, close their eyes, and walk to where they perceive the object to be. Because of the limited space in front of projection-based displays, however, this technique is difficult to perform. To the best of our knowledge, only one technique, imagined walking [Plumert et al. 2005], has been applied in projection-based environments. We apply triangulated walking [Knapp 1999] in projection-based environments and compare it to imagined walking and verbal estimation.
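For readers unfamiliar with triangulated walking, the following is a minimal sketch of one common triangulation geometry; the perpendicular walked leg of length $w$ and the response angle $\theta$ are illustrative assumptions for exposition, not necessarily the exact protocol of Knapp [1999]. If a participant views a target at perceived distance $d$, then walks without vision a leg of length $w$ perpendicular to the original line of sight and, at the stopping point, turns or points toward the remembered target location, the angle $\theta$ between the walked leg (looking back toward the start point) and the pointing direction determines the indicated distance from the right triangle formed by the start point, the stopping point, and the target:
\[
d = w \tan\theta .
\]
Because only a short walked leg is required, this response can in principle be collected within the restricted space in front of a projection-based display.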