
Show that f(x) is constant

  • Thread starter Jamin2112
  • Start date

Homework Statement

(I don't have my book with me, so this may not be the correct word-for-word representation of the exercise)

Suppose f(x) is differentiable on the whole real line. Show that f(x) is constant if for all real numbers x and y,

|f(x) - f(y)| ≤ (x - y)²

Homework Equations

Definition of a derivative

The Attempt at a Solution

f(x)-f(y) ≤ |f(x)-f(y)|, so by basic algebra we have [f(x)-f(y)]/(x-y) ≤ x - y. Letting x approach y on both sides of the inequality yields f '(x) ≤ 0.

........ Now I somehow need to show that f '(x) ≥ 0. Ideas?

Answers and Replies

Try using that:

|f(x)-f(y)| ≤ (x-y)² = |x-y|·|x-y|


0 ≤ |f(x)-f(y)| ≤ (x-y)² = |x-y|·|x-y|

----> 0 ≤ |f(x)-f(y)| / |x-y| ≤ |x-y|
----> f '(x) = 0 by the Squeeze Theorem

My book uses lim_{t→x} (f(t)-f(x))/(t-x) for f '(x), which I guess is equivalent to lim_{t→x} |f(t)-f(x)|/|t-x|.
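A quick numerical sanity check of the hypothesis (a Python sketch; the grid and the test functions are illustrative choices, not from the exercise): a constant function satisfies |f(x)-f(y)| ≤ (x-y)² everywhere, while even f(x) = x already fails it.

```python
import itertools

def satisfies_condition(f, pts, tol=1e-12):
    """True if |f(x)-f(y)| <= (x-y)^2 for every pair of grid points."""
    return all(abs(f(x) - f(y)) <= (x - y) ** 2 + tol
               for x, y in itertools.product(pts, repeat=2))

pts = [i / 10 for i in range(-20, 21)]            # grid on [-2, 2]
print(satisfies_condition(lambda x: 3.0, pts))    # constant -> True
print(satisfies_condition(lambda x: x, pts))      # identity -> False
```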

Yep. :)

The latter limit is actually |f'(x)|.

You actually do not need to use calculus to prove this result! In particular, fix x₀ < y₀ and, for each positive integer n, set δ = (y₀-x₀)/n. Then we have:

|f(y₀)-f(x₀)| ≤ Σᵢ |f(x₀+iδ)-f(x₀+(i-1)δ)| ≤ n·δ² = (y₀-x₀)²/n

Since our choice of n was arbitrary, letting n → ∞ forces |f(y₀)-f(x₀)| = 0, or equivalently f(y₀) = f(x₀).
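The telescoping bound can also be seen numerically (the endpoints below are illustrative): n subintervals of width δ = (y₀-x₀)/n, each contributing at most δ², give a total of (y₀-x₀)²/n, which shrinks as n grows.

```python
# Telescoping bound: n subintervals, each contributing at most delta^2.
x0, y0 = 0.0, 3.0                 # illustrative endpoints
for n in [1, 10, 100, 1000]:
    delta = (y0 - x0) / n
    bound = n * delta ** 2        # sum of n terms, each <= delta^2
    print(n, bound)               # equals (y0 - x0)^2 / n, -> 0
```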

Yep. :)

The latter limit is actually |f'(x)|.


But I need to show that f '(x) = 0.

Wut do?


Indeed. Your work is fine.

You showed that |f'(x)|=0 which implies that f'(x)=0.


Thanks so much! I have another question.

Suppose f is defined and differentiable for every x > 0, and f '(x) → 0 as x → +∞. Put g(x) = f(x+1) - f(x). Prove that g(x) → 0 as x → +∞.

That g(x) → 0 as x → +∞ is quite obvious, and its proof should be too.

Attempt: Fix ε > 0. We seek a positive real number x' such that |f(x+1) - f(x)| < ε whenever x ≥ x'.

I also know that for any δ > 0 there exists an x* such that |f '(x)| = | lim_{t→x} (f(t) - f(x)) / (t-x) | < δ whenever x ≥ x*.

Where to go from here, I know not. Hint?

Try thinking about doing a proof by contradiction. Suppose g(x) does NOT go to zero. Can you use that to prove f'(x) does NOT go to zero?
The Mean value theorem states that for every interval [a,b] there is a value c in the interval (a,b) such that:
[tex]f'(c)={f(b)-f(a) \over b-a}[/tex]
assuming f is differentiable.

Perhaps you can use that?

Try thinking about doing a proof by contradiction. Suppose g(x) does NOT go to zero. Can you use that to prove f'(x) does NOT go to zero?

Suppose g(x) does not approach 0 as x approaches ∞. Then there exists ε > 0 such that for any positive real number x, there is another positive real number x' with x' > x and |g(x')| = |f(x'+1) - f(x')| ≥ ε. Using what I like Serena (Serena Williams?) said, there exists a point σ in (x', x'+1) such that f '(σ) = (f(x'+1) - f(x')) / (x'+1 - x') = f(x'+1) - f(x').

Am I almost there? Is the final step staring me in the face? I need to go watch John Stossel for an hour, then I'll be back.


If you'll agree that means |f'(sigma)|>=epsilon for arbitrarily large values of sigma, I think that would rule out f'(x) approaching 0 as x->infinity, wouldn't it? Let us know when you are back from Stossel. And I really can't condone watching Fox News. May hurt your mathematical skills.
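The contradiction can be made concrete with an example of my own choosing: f(x) = √x has f '(x) = 1/(2√x) → 0, and by the MVT g(x) = f(x+1) - f(x) equals f '(σ) for some σ in (x, x+1), so g tends to 0 as well. A Python sketch:

```python
import math

f = math.sqrt
for x in [1.0, 100.0, 10000.0]:
    g = f(x + 1) - f(x)               # g(x) = f(x+1) - f(x)
    fprime = 1 / (2 * math.sqrt(x))   # f'(x); here f' is decreasing,
    print(x, g, fprime)               # so 0 < g = f'(sigma) < f'(x)
```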
And I really can't condone watching Fox News.

You're silly. John Stossel is on Fox Business Network.

And yes, the proof by contradiction seems to have worked out. I'll hit you guys up with s'more questions later.

New Question:

Suppose
(a) f is continuous for x ≥ 0,
(b) f'(x) exists for x > 0,
(c) f(0) = 0,
(d) f' is monotonically increasing.
Put

g(x) = f(x)/x for x > 0,

and prove that g is monotonically increasing.

Proof (Attempt). By Theorem 5.11(a), it suffices to show that g'(x) ≥ 0 for all x > 0. By Theorem 5.3(c),

g'(x) = [x·f'(x) - f(x)·1] / x² = f'(x)/x - f(x)/x²,

so we need to show that
f'(x)/x - f(x)/x² ≥ 0 for all x > 0,

or equivalently
f'(x) ≥ f(x)/x for all x > 0.

I'm a little stuck now. None of the theorems that I invoked require that f be continuous anywhere [though of course differentiability implies continuity, so without (a) I'd still know that f is continuous on (0, ∞)]. I'm sure the next step has something to do with the continuity of f. Any suggestions?

Again the Mean value theorem can come to the rescue.
Can you apply it to f(x)/x?

Btw, the specific conditions, like continuity at the boundary, are preconditions for the use of the theorem.
(Perhaps you should check those in the Wikipedia article to make your proof complete.)

That's why I'm skeptical about the following proof (which I found on the internet): it invokes the Mean Value Theorem without the precondition of continuity at the boundary. What's the deal?
[Attached image: screen-capture-4-20.png]
It is applicable in this case.
Since f is differentiable on (a,b), it is also continuous on (a,b).

Now consider the interval [x,y], which is a subinterval of (a,b).
It meets all conditions of the Mean value theorem.


Ah, I see. Thanks so much for the clarification.
Tell me if this is a good thorough proof. My professor is very stingy, so I can't be as willy-nilly as the other proof above.
[Attached image: screen-capture-58.png]
Even if your professor is stingy, the first part of your proof has more information than you need. For example, instead of "then f is differentiable on (x,z) and continuous on [x,z]" you can just say "then f is differentiable on [x,z]". This can make the first part of your proof a little more tidy.

The second proof is just wrong. If ##x_n = x## for all n in ℕ, then what you wrote is just nonsense. Unlike continuity, there is not really a nice definition of differentiability in terms of convergent sequences. You are better off noting that g: im(f) → (a,b), so every element in the domain of g can be written in the form f(x) for some x in (a,b). Then just write out the criterion for differentiability of g and show that the resulting limit is 1/f'(x).

Hmm, overall it looks good, but I see a few notational issues.
Since you're asking me to nitpick, I will.

The problem statement is missing.
In particular, the definition of f is missing, as is the condition that f'(x) > 0 for x in (a,b).

You should write f(z) - f(x) instead of f(z) - f(z).

You should include the condition that for all n: ##x_n \ne x##, otherwise you get divisions by zero.

There should also be mention that ##f(x_n) - f(x)## can not be zero, otherwise you could get a division by zero.

I'm not familiar with the notation ##\lim_{x_n \to x}## although I guess it's not wrong.
I am used to writing ##\lim\limits_{n \to \infty}##.

You did not define g, which you should.
Apparently g is supposed to be the inverse of f, which also requires f to be invertible.

You should write ##1 \over f'(x)## instead of ##1 \over g'(x)##.

Otherwise it looks fine to me! :smile:

Even if your professor is stingy, the first part of your proof has more information than you need. For example, instead of "then f is differentiable on (x,z) and continuous on [x,z]" you can just say "then f is differentiable on [x,z]". This can make the first part of your proof a little more tidy.

The second proof is just wrong. If xn = x for all n in N, then what you wrote is just nonsense. Unlike continuity, there is not really a nice definition of differentiability in terms of convergent sequences. You are better off noting that g:im(f)→(a,b) so every element in the domain of g can be written in the form f(x) for some x in (a,b). Then just write out the criterion for differentiability of g and show that the resulting limit is 1/f'(x).


Your point has been duly noted.
I don't know how to write "lim" with "x_n → x" below it in Microsoft Word's equation editor. And now I understand why my book has 0 < |x-y| < δ in the definition of a limit; I always assumed the "0 <" part was irrelevant.

Let me know what you think about the next problem.

Ex. 3. Suppose g is a real function on ℝ, with bounded derivative (say |g'| ≤ M). Fix ε > 0, and define f(x) = x + εg(x). Prove that f is one-to-one if ε is small enough.

Attempt:

[Attached image: screen-capture-2-28.png]
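For what it's worth, the standard argument for this exercise runs: if |g'| ≤ M and ε < 1/M, then f'(x) = 1 + εg'(x) ≥ 1 - εM > 0, so f is strictly increasing and hence one-to-one. A Python sketch (the choices g = sin, so M = 1, and ε = 0.5 are purely illustrative):

```python
import math

eps = 0.5                                  # eps < 1/M with M = 1 for g = sin
f = lambda x: x + eps * math.sin(x)        # f'(x) = 1 + eps*cos(x) >= 0.5 > 0

xs = [i / 100 for i in range(-500, 501)]   # grid on [-5, 5]
values = [f(x) for x in xs]
print(all(a < b for a, b in zip(values, values[1:])))   # strictly increasing
```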
Perhaps you could try lim from {n rightarrow infty}?

Yes, that is one reason for 0<|x-y|<δ, which is necessary for derivatives.

In the case of a regular limit there is another reason though.
It is possible that the limit of a function is not equal to the function value in that point.
However, in that case the limit still exists.
So the definition of a limit in general requires that 0<|x-y|<δ.
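A tiny example of that last point (a function of my own construction): the limit at 0 exists and equals 1, yet the function's value there is 0, which is why the definition must exclude the point itself via 0 < |x-y| < δ.

```python
def f(x):
    # limit at 0 is 1, but f(0) = 0
    return 0.0 if x == 0 else 1.0

near_zero = [f(10.0 ** -k) for k in range(1, 8)]
print(near_zero)   # every sample is 1.0
print(f(0))        # 0.0
```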

Did you like how I used the word "thereupon"? I get sick of using "consequently", so I went to the thesaurus to find a cool synonym.

I noticed the word and I had to smile. :)
What about "therefore" or "yields"?

I have to warn you though, English is not my native language, so I don't know what is proper.
I'd like to think that I'm better at math than at language. :wink:

f(x)-f(y) ≤ |f(x)-f(y)|, so by basic algebra we have [f(x)-f(y)]/(x-y) ≤ x - y. Letting x approach y on both sides of the inequality yields f '(x) ≤ 0.

I'm somehow not convinced that that represents f '(x). Why not f '(y), since writing x as [tex]y+\Delta y[/tex] allows the Newton quotient to be f '(y)?

i.e. [tex]\lim_{\Delta y \to 0}\frac{f(y+\Delta y)-f(y)}{(y+\Delta y)-y} \le \lim_{\Delta y \to 0}\Delta y = 0[/tex]

Again the Mean value theorem can come to the rescue.
Can you apply it to f(x)/x?

Choose y such that 0 < y < ∞. The function f(y)/y is continuous on [0, y] and differentiable on (0, y), so there exists an x in (0, y) such that (f(x)/x)' = [f(y) - f(0)]/[y - 0] = f(y) / y. Am I close?

I'm somehow not convinced that that represents f '(x). Why not f '(y), since writing x as [tex]y+\Delta y[/tex] allows the Newton quotient to be f '(y)?

i.e. [tex]\lim_{\Delta y \to 0}\frac{f(y+\Delta y)-f(y)}{(y+\Delta y)-y} \le \lim_{\Delta y \to 0}\Delta y = 0[/tex]


I worked out all the kinks. Don't worry.
[Attached image: screen-capture-59.png]
Choose y such that 0 < y < ∞. The function f(y)/y is continuous on [0, y] and differentiable on (0, y), so there exists an x in (0, y) such that (f(x)/x)' = [f(y) - f(0)]/[y - 0] = f(y) / y. Am I close?

Close yes. :smile:
It should be: f'(x) = [f(y) - f(0)]/[y - 0] = f(y) / y

Got it.

Choose x with 0 < x < ∞. The function f is continuous on [0, x] and differentiable on (0, x). Thus, by the Mean Value Theorem, there exists an x₀ with 0 < x₀ < x and

f(x) = f(x) - f(0) = f '(x₀)·(x - 0) = x·f '(x₀).

Because f ' is monotonically increasing and x > 0, we have x·f '(x₀) ≤ x·f '(x) and accordingly x·f '(x) ≥ f(x), as desired.
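The finished result can be spot-checked with an illustrative f of my choosing: f(x) = eˣ - 1 has f(0) = 0 and f ' = eˣ monotonically increasing, so g(x) = f(x)/x should come out monotonically increasing on (0, ∞).

```python
import math

g = lambda x: (math.exp(x) - 1) / x       # g(x) = f(x)/x with f(x) = e^x - 1

xs = [i / 10 for i in range(1, 51)]       # grid on (0, 5]
values = [g(x) for x in xs]
print(all(a < b for a, b in zip(values, values[1:])))   # increasing on the grid
```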



Source: https://www.physicsforums.com/threads/show-that-f-x-is-constant.562214/