Suppose f(x) has a continuous second derivative on [0,1] and |f''(x)| <= A for x ∈ [0,1]. Prove that |f'(x)| <= |f(1)-f(0)| + A/2 for all x ∈ [0,1].
By Taylor's theorem with the Lagrange remainder, expand f about the point x:
f(0) = f(x) + f'(x)(0-x) + 0.5f''(a)x^2, for some a between 0 and x;
f(1) = f(x) + f'(x)(1-x) + 0.5f''(b)(1-x)^2, for some b between x and 1.
Subtracting the first equation from the second, the f(x) terms cancel and the f'(x) terms combine as f'(x)(1-x) + f'(x)x = f'(x). Solving for f'(x) and taking absolute values gives
|f'(x)| = |f(1) - f(0) + 0.5f''(a)x^2 - 0.5f''(b)(1-x)^2|
<= |f(1)-f(0)| + 0.5A(x^2 + (1-x)^2)
<= |f(1)-f(0)| + 0.5A.
The last inequality holds because the quadratic x^2 + (1-x)^2 attains its maximum value 1 on [0,1] (at the endpoints x = 0 and x = 1).
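As a numerical sanity check (not part of the proof), the bound can be verified on a concrete test function. The choice f(x) = sin(πx) below is a hypothetical example: its second derivative is -π²·sin(πx), so A = π², and the bound |f(1)-f(0)| + A/2 = π²/2 should dominate |f'(x)| = π|cos(πx)| everywhere on [0,1].

```python
import math

# Test function (illustrative choice, not from the proof): f(x) = sin(pi*x).
f = lambda x: math.sin(math.pi * x)
fp = lambda x: math.pi * math.cos(math.pi * x)  # exact f'(x)
A = math.pi ** 2                                # sup of |f''| on [0, 1]

# Right-hand side of the inequality: |f(1)-f(0)| + A/2.
bound = abs(f(1) - f(0)) + A / 2

# Left-hand side, sampled on a grid over [0, 1].
worst = max(abs(fp(i / 1000)) for i in range(1001))

print(worst, bound)  # worst |f'| is pi, the bound is pi^2/2
assert worst <= bound
```

Here the maximum of |f'| is π ≈ 3.14 (at the endpoints), safely below the bound π²/2 ≈ 4.93, consistent with the inequality just proved.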