The answer is no unless the relationship between X and Y is perfectly linear and deterministic (i.e. every point lies exactly on a line, so the regression residuals and the coefficient standard errors are all zero). In essence, regressing y on x solves a different optimization problem than regressing x on y: the former minimizes squared errors in the y direction, the latter in the x direction. Suppose the y-on-x fit is y = a + b*x + e; rearranging that line gives x = -a/b + (1/b)*y. For the x-on-y regression line to coincide with it, its slope must therefore be non-zero and equal to 1/b. Least squares gives b = cov(x,y)/var(x) for the y-on-x slope and cov(x,y)/var(y) for the x-on-y slope, so the coincidence condition cov(x,y)/var(y) = var(x)/cov(x,y) is equivalent to cov(x,y)^2 = var(x)*var(y), which holds only when the correlation between x and y is exactly 1 or -1.
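The same conclusion falls out of multiplying the two slope estimates (a short derivation, writing ρ for the sample correlation between x and y):

```latex
b_{y \mid x} \, b_{x \mid y}
  = \frac{\operatorname{cov}(x,y)}{\operatorname{var}(x)}
    \cdot \frac{\operatorname{cov}(x,y)}{\operatorname{var}(y)}
  = \frac{\operatorname{cov}(x,y)^2}{\operatorname{var}(x)\,\operatorname{var}(y)}
  = \rho^2
```

The two lines coincide exactly when b_{x|y} = 1/b_{y|x}, i.e. when this product equals 1, i.e. when ρ = ±1.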
The difference between regressing y on x and regressing x on y can also be observed empirically through a simple example using the famous "mtcars" data in R: fit the regressions "mpg~disp" and "disp~mpg" and note that the two slope estimates are not reciprocals of each other:
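A minimal sketch of those two fits (mtcars ships with base R; the approximate values in the comments are from running this on the standard dataset):

```r
# Slope of the regression of mpg on disp
b1 <- coef(lm(mpg ~ disp, data = mtcars))["disp"]
# Slope of the regression of disp on mpg
b2 <- coef(lm(disp ~ mpg, data = mtcars))["mpg"]

b1          # ~ -0.041
1 / b1      # ~ -24.3, which b2 would have to equal for the lines to coincide
b2          # ~ -17.4, so the slopes are not reciprocals

# The product of the two slopes recovers the squared correlation
b1 * b2                             # ~ 0.72
cor(mtcars$mpg, mtcars$disp)^2      # same value
```

Because |cor(mpg, disp)| < 1, the product of the slopes is less than 1 and the two fitted lines differ.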
Finally, another important consideration is that the causal relationship between x and y may not be invertible. In practice, it may be that the independent variable x causes the dependent variable y, but not the other way around. For example, an increase in the unemployment rate may raise loan default rates (i.e. borrowers' inability to pay back their loans), but the reverse relationship does not hold.