Absolutely continuous, L^2 question
Page 1 of 2 [16 Posts]
David C. Ullrich
science forum Guru


Joined: 28 Apr 2005
Posts: 2250

Posted: Thu Jul 13, 2006 10:35 am    Post subject: Re: Absolutely continuous, L^2 question

On Wed, 12 Jul 2006 15:39:57 EDT, James <james545@gmail.com> wrote:

Quote:
In article <5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'| <= (int_[x,y] |f'|^2)^(1/2) * |y-x|^(1/2) <= (int_[0,oo) |f'|^2)^(1/2) * |y-x|^(1/2). So there is a constant C such that |f(y) - f(x)| <= C*|y-x|^(1/2) for all x, y in [0,oo).

It's really the first part of your response that I am having trouble with. You say "If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'|".

It seems to me that what you are basically saying is that f is absolutely continuous on all of [0,oo). In order to say that |f(y) - f(x)| = |int_[x,y] f'|, you need to first say that f(y) = int_[0,y] f' and f(x) = int_[0,x] f'. In order to say those two things you need to say that f is absolutely continuous on [0,z] where z is greater than y and x. So ok, given x and y, there is a z that makes this work.

Right. So what the heck is the problem?

Quote:
If you pick x' and y', then there is a z' that makes this work. So it changes.

So what? It's still true that for any x and y

f(y) - f(x) = int_x^y f'(t) dt,

because f is AC on [x,y]. How does the fact that if you change x and y
then you change x and y affect this?

Quote:
I am missing a logical step here. Side question : If f is absolutely continuous on [0,x] for all x > 0, then does this imply that f is absolutely continuous on [0,oo)?

No. And that doesn't matter one bit - all we use above is the fact
that it's AC on [x,y].

Quote:
James


************************

David C. Ullrich
Ronald Bruck
science forum Guru


Joined: 05 Jun 2005
Posts: 356

Posted: Wed Jul 12, 2006 8:10 pm    Post subject: Re: Absolutely continuous, L^2 question

In article
<30772584.1152733227614.JavaMail.jakarta@nitrogen.mathforum.org>, James
<james545@gmail.com> wrote:

Quote:
In article <5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'| <= (int_[x,y] |f'|^2)^(1/2) * |y-x|^(1/2) <= (int_[0,oo) |f'|^2)^(1/2) * |y-x|^(1/2). So there is a constant C such that |f(y) - f(x)| <= C*|y-x|^(1/2) for all x, y in [0,oo).

It's really the first part of your response that I am having trouble with. You say "If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'|".

It seems to me that what you are basically saying is that f is absolutely continuous on all of [0,oo). In order to say that |f(y) - f(x)| = |int_[x,y] f'|, you need to first say that f(y) = int_[0,y] f' and f(x) = int_[0,x] f'. In order to say those two things you need to say that f is absolutely continuous on [0,z] where z is greater than y and x. So ok, given x and y, there is a z that makes this work. If you pick x' and y', then there is a z' that makes this work. So it changes.

I am missing a logical step here. Side question: If f is absolutely continuous on [0,x] for all x > 0, then does this imply that f is absolutely continuous on [0,oo)?

You're straining at a gnat. You're told f is a.c. on [0,x] for all x.
That's all you need. You don't need a.c. on all of R.

It doesn't matter that x and y can change. IN THIS SUBARGUMENT, THEY
DON'T. x and y are fixed, and we claim that for THESE x and y,
f(y)-f(x) = \int_x^y f'. (And you don't need to do this by subtracting
integrals; the fact that f is a.c. on [0,y] IMPLIES, a fortiori, that f
is a.c. on [x,y] (look at the DEFINITION of a.c.); hence that f(y)-f(x)
= \int_x^y f'.)
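
For reference, the definition being pointed to is the standard textbook one; written out as a LaTeX sketch (nothing here is specific to this problem), the restriction step is immediate:

% f is absolutely continuous on an interval I if:
\[
\forall \varepsilon > 0 \;\; \exists \delta > 0 : \quad
\sum_{k=1}^{n} (b_k - a_k) < \delta
\;\Longrightarrow\;
\sum_{k=1}^{n} \bigl| f(b_k) - f(a_k) \bigr| < \varepsilon
\]
% for every finite collection of pairwise disjoint subintervals (a_1,b_1), ..., (a_n,b_n) of I.

Any disjoint family of subintervals of [x,y] is also a disjoint family of subintervals of [0,y], so the same delta works, and f absolutely continuous on [0,y] does give f absolutely continuous on [x,y].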

--
Ron Bruck

The World Wide Wade
science forum Guru


Joined: 24 Mar 2005
Posts: 790

Posted: Wed Jul 12, 2006 8:04 pm    Post subject: Re: Absolutely continuous, L^2 question

In article
<30772584.1152733227614.JavaMail.jakarta@nitrogen.mathforum.org>,
James <james545@gmail.com> wrote:

Quote:
In article <5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'| <= (int_[x,y] |f'|^2)^(1/2) * |y-x|^(1/2) <= (int_[0,oo) |f'|^2)^(1/2) * |y-x|^(1/2). So there is a constant C such that |f(y) - f(x)| <= C*|y-x|^(1/2) for all x, y in [0,oo).

It's really the first part of your response that I am having trouble with. You say "If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'|".

It seems to me that what you are basically saying is that f is absolutely continuous on all of [0,oo). In order to say that |f(y) - f(x)| = |int_[x,y] f'|, you need to first say that f(y) = int_[0,y] f' and f(x) = int_[0,x] f'. In order to say those two things you need to say that f is absolutely continuous on [0,z] where z is greater than y and x. So ok, given x and y, there is a z that makes this work. If you pick x' and y', then there is a z' that makes this work. So it changes.

f is AC on every bounded subinterval [a,b] of [0,oo). Therefore
f(b) - f(a) = int_[a,b] f' for every bounded subinterval [a,b] of
[0,oo). I'm not sure how to make this clearer; I'm just using the
central basic theorem as it is stated.
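
The "central basic theorem" here is presumably the fundamental theorem of calculus for absolutely continuous functions; for reference, a LaTeX sketch of the standard statement:

% If f is absolutely continuous on [a,b], then f' exists a.e. on [a,b],
% f' is in L^1[a,b], and
\[
f(x) - f(a) = \int_a^x f'(t) \, dt \qquad \text{for every } x \in [a,b].
\]

Applied with [a,b] = [x,y] (legitimate, since absolute continuity on [0,y] restricts to [x,y]), this is exactly the identity f(y) - f(x) = int_[x,y] f' used above.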


Quote:
Side question : If f is absolutely
continuous on [0,x] for all x > 0, then does this imply that f is absolutely
continuous on [0,oo)?

Certainly not.
James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 7:54 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
In article <5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

Oh crap, I'm sick and tired of this thread. Look: for 0 < x < y,

|f(y)^2 - f(x)^2| = |\int_x^y 2 f(t) f'(t) dt|

(I'll leave the fact that f^2 is ac, hence (f^2)' = 2ff' a.e. to you),

<= (\int_x^y |f(t)|^2 dt)^(1/2) (\int_x^y |f'(t)|^2 dt)^(1/2)

<= (\int_x^\infty |f|^2)^(1/2) (\int_x^\infty |f'|^2)^(1/2)

and here's the point: if g \in L^2, then \int_x^\infty |g|^2 --> 0 as x --> infinity. From this you should see that f(x) is a Cauchy net as x --> infinity, hence has a limit. Now what kind of limits can an L^2 function have?

Thank you, and you probably mean "(f(x))^2 is a Cauchy net as x ---> infinity".

Ronald Bruck
science forum Guru


Joined: 05 Jun 2005
Posts: 356

Posted: Wed Jul 12, 2006 7:41 pm    Post subject: Re: Absolutely continuous, L^2 question

In article
<5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James
<james545@gmail.com> wrote:

Quote:
In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

Oh crap, I'm sick and tired of this thread. Look: for 0 < x < y,

|f(y)^2 - f(x)^2| = |\int_x^y 2 f(t) f'(t) dt|

(I'll leave the fact that f^2 is ac, hence (f^2)' = 2ff' a.e. to you),

<= (\int_x^y |f(t)|^2 dt)^(1/2) (\int_x^y |f'(t)|^2 dt)^(1/2)

<= (\int_x^\infty |f|^2)^(1/2) (\int_x^\infty |f'|^2)^(1/2)

and here's the point: if g \in L^2, then \int_x^\infty |g|^2 --> 0 as
x --> infinity. From this you should see that f(x) is a Cauchy net as
x --> infinity, hence has a limit. Now what kind of limits can an L^2
function have?
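
A sketch of the remaining step, using only the bound just displayed (and, as James points out in his reply, it is f(x)^2 that is Cauchy):

\[
\bigl| f(y)^2 - f(x)^2 \bigr|
\le \Bigl( \int_x^\infty |f|^2 \Bigr)^{1/2} \Bigl( \int_x^\infty |f'|^2 \Bigr)^{1/2}
\longrightarrow 0
\quad \text{as } x \to \infty \ \ (0 < x < y),
\]

so f(x)^2 satisfies the Cauchy criterion as x --> oo and converges to some L >= 0. If L > 0, then |f(x)|^2 >= L/2 for all large x, contradicting int_[0,oo) |f|^2 < oo; hence L = 0 and f(x) --> 0.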

--
Ron Bruck

James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 7:39 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
In article <5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'| <= (int_[x,y] |f'|^2)^(1/2) * |y-x|^(1/2) <= (int_[0,oo) |f'|^2)^(1/2) * |y-x|^(1/2). So there is a constant C such that |f(y) - f(x)| <= C*|y-x|^(1/2) for all x, y in [0,oo).

It's really the first part of your response that I am having trouble with. You say "If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f' |".

It seems to me that what you are basically saying is that f is absolutely continuous on all [0,oo). In order to say that |f(y) - f(x)| = |int_[x,y] f' |, you need to first say that f(y) = int_[0,y] f' and f(x) = int[0,x] f'. In order to say those two things you need to say that f is absolutely continuous on [0,z] where z is greater than y and x. So ok, given x and y, there is a z that makes this work. If you pick x' and y', then there is a z' that makes this work. So it changes.

I am missing a logical step here. Side question : If f is absolutely continuous on [0,x] for all x > 0, then does this imply that f is absolutely continuous on [0,oo)?

James
The World Wide Wade
science forum Guru


Joined: 24 Mar 2005
Posts: 790

Posted: Wed Jul 12, 2006 7:13 pm    Post subject: Re: Absolutely continuous, L^2 question

In article
<5036829.1152728209310.JavaMail.jakarta@nitrogen.mathforum.org>,
James <james545@gmail.com> wrote:

Quote:
In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f'| <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f'| you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

If 0 <= x < y, you have |f(y) - f(x)| = |int_[x,y] f'| <= (int_[x,y] |f'|^2)^(1/2) * |y-x|^(1/2) <= (int_[0,oo) |f'|^2)^(1/2) * |y-x|^(1/2). So there is a constant C such that |f(y) - f(x)| <= C*|y-x|^(1/2) for all x, y in [0,oo).
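
The first inequality above is Cauchy-Schwarz on [x,y] applied to 1 * |f'|; a LaTeX sketch of the chain, which also shows where the x-independent delta comes from:

\[
|f(y) - f(x)|
= \Bigl| \int_x^y f'(t) \, dt \Bigr|
\le \Bigl( \int_x^y 1^2 \, dt \Bigr)^{1/2} \Bigl( \int_x^y |f'|^2 \Bigr)^{1/2}
\le \| f' \|_{L^2[0,\infty)} \, |y - x|^{1/2}.
\]

With C = ||f'||_{L^2[0,oo)}, taking delta = (eps/C)^2 (when C > 0) gives |f(y) - f(x)| < eps whenever |y - x| < delta, and this delta does not depend on where x and y sit in [0,oo).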
James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 6:16 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
In article <1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>, James <james545@gmail.com> wrote:

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on [0,oo).)

Dear Wade,

I don't see it. You say there is a universal delta on all of [0,oo)?

|f(x+h) - f(x)| = |int_[x,x+h] f' | <= ||f'||_2 * h^(1/2) ---> 0 as h goes to 0. But to justify |f(x+h) - f(x)| = |int_[x,x+h] f' | you need to say that f is AC on [0,a] where a is greater than x+h. So you have that f is uniformly continuous on [0,a]. The fact that |f(x+h) - f(x)| --> 0 as h ---> 0 for all x in [0,a] doesn't give you what you want. If you want to say that f is uniformly continuous beyond a, then when you write |f(x+h) - f(x)| ---> 0 as h --> 0 (uniformly in x), this means that for all eps > 0 there is an s > 0 (i.e. delta > 0) with |f(x+h) - f(x)| < eps for all h < s for all x in [0,a]. But what I am saying is that if you want to check uniform continuity at y > a, I think that this delta (or s) changes.

In any case, whether or not my babbling above makes any sense, please share with me how you are getting your universal delta for the uniform continuity of f on all of [0,oo).

Thank you,

James
The World Wide Wade
science forum Guru


Joined: 24 Mar 2005
Posts: 790

Posted: Wed Jul 12, 2006 5:49 pm    Post subject: Re: Absolutely continuous, L^2 question

In article
<1820258.1152720081570.JavaMail.jakarta@nitrogen.mathforum.org>,
James <james545@gmail.com> wrote:

Quote:
On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

I don't think you know what his argument is. (Yes, f is UC on
[0,oo).)
James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 4:00 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

Are you asserting that f is uniformly continuous on all of [0,oo)? Your argument shows that f is uniformly continuous on [0,x] since f is AC on [0,x].

Quote:

Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f' = int_[0,oo] f'. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James


************************

David C. Ullrich
eugene
science forum Guru


Joined: 24 Nov 2005
Posts: 331

Posted: Wed Jul 12, 2006 3:05 pm    Post subject: Re: Absolutely continuous, L^2 question

James wrote:
Quote:
James wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo. Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f' = int_[0,oo] f'. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James

Just a remark: Doesn't it follow from f, f' being in L^2[0,oo) that f and f' are bounded, and then from |f(x) - f(y)| <= sup_R |f'| * |x - y| and sup |f| < oo that f^2 is uniformly continuous?

Why does it follow that f,f' are bounded? I also wouldn't see why f^2 is uniformly continuous in your case either.

Yes, you'd better follow David:
|f(x+h) - f(x)| <= int_[x,x+h] |f'(t)| dt <= ||f'||_2 * sqrt(h),
which implies the uniform continuity, from which for any given e > 0 you can choose a > 0 such that
|f(x) - f(y)| < e whenever |x - y| < a, so there exists a > 0 such that
|f(x)| = |f(x) - f(0)| < e whenever |x| < a.
James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 2:24 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
James wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo. Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f' = int_[0,oo] f'. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James

Just a remark: Doesn't it follow from f, f' being in L^2[0,oo) that f and f' are bounded, and then from |f(x) - f(y)| <= sup_R |f'| * |x - y| and sup |f| < oo that f^2 is uniformly continuous?


Why does it follow that f,f' are bounded? I also wouldn't see why f^2 is uniformly continuous in your case either.
James1118
science forum Guru Wannabe


Joined: 04 Feb 2005
Posts: 154

Posted: Wed Jul 12, 2006 2:17 pm    Post subject: Re: Absolutely continuous, L^2 question

Quote:
On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f' be in L^2[0,oo). Let f(0) = 0.

Prove that lim f(x) = 0 as x --> oo.

There was a part (a) to this problem that I got: Prove that int_[0,x] |f f'| <= .5*(int_[0,x] |f'|)^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.

I already knew that f is uniformly continuous. It is assumed that f is absolutely continuous. That isn't my problem...

Quote:

Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f' = int_[0,oo] f'. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James


************************

David C. Ullrich
David C. Ullrich
science forum Guru


Joined: 28 Apr 2005
Posts: 2250

Posted: Wed Jul 12, 2006 2:02 pm    Post subject: Re: Absolutely continuous, L^2 question

On Wed, 12 Jul 2006 09:00:56 EDT, James <james545@gmail.com> wrote:

Quote:
This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f ' be in L^2[0,oo). Let f(0) = 0.

Prove that

lim f(x) = 0
x --> oo

There was a part (a) to this problem that I got : Prove that int_[0,x] |f f '| <= .5* (int_[0,x] |f '| )^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo.

Well, the fact that f' is in L^2 shows that f is uniformly
continuous:

|f(x+h) - f(x)| = ______ <= _______,

which tends to 0 as h -> 0, uniformly in x.
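
One way the blanks can be filled in (a sketch; this is essentially what gets spelled out elsewhere in the thread):

\[
|f(x+h) - f(x)|
= \Bigl| \int_x^{x+h} f'(t) \, dt \Bigr|
\le \Bigl( \int_x^{x+h} |f'|^2 \Bigr)^{1/2} h^{1/2}
\le \| f' \|_{L^2[0,\infty)} \, h^{1/2},
\]

using absolute continuity of f on [0, x+h] for the first equality and Cauchy-Schwarz for the middle inequality.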

Quote:
Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f ' = int_[0,oo] f '. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James


************************

David C. Ullrich
eugene
science forum Guru


Joined: 24 Nov 2005
Posts: 331

Posted: Wed Jul 12, 2006 1:29 pm    Post subject: Re: Absolutely continuous, L^2 question

James wrote:
Quote:
This question has been killing me. Please help with any insight:

Let f be absolutely continuous on [0,x] for all x > 0. Let f, f ' be in L^2[0,oo). Let f(0) = 0.

Prove that

lim f(x) = 0
x --> oo

There was a part (a) to this problem that I got : Prove that int_[0,x] |f f '| <= .5* (int_[0,x] |f '| )^2. But I don't see how this helps for part (b).

I have tried several ways to prove that lim f(x) = 0 as x --> oo.

1) I have proven that if f is in L^1[0,oo) and f is uniformly continuous, then lim f(x) = 0 as x --> oo. Unfortunately our f is absolutely continuous only on [0,x] for all x > 0, but I think I'd need it to be on [0,oo). Even then, it is f^2 that would need to be uniformly continuous to imply that f^2 is in L^1[0,oo), so f^2 goes to 0.

2) f is absolutely continuous, so lim f(x) = lim int_[0,x] f ' = int_[0,oo] f '. I couldn't show that the latter is 0.

I would appreciate any help you can give,

James

Just a remark: Doesn't it follow from f, f' being in L^2[0,oo) that f and f' are bounded, and then from |f(x) - f(y)| <= sup_R |f'| * |x - y| and sup |f| < oo that f^2 is uniformly continuous?