Let f(x) have a continuous second derivative on [0,1], with |f''(x)| <= A for x in [0,1]. Prove that |f'(x)| <= |f(1)-f(0)| + A/2 for all x in [0,1].

Hoping an expert can point the way — I only have 20 points left to offer.
2025-01-03 19:28:19
Recommended answers (1)
Answer 1:

Expand f by Taylor's theorem about the point x, with Lagrange remainders (a lies between 0 and x, b lies between x and 1):

f(0) = f(x) + f'(x)(0-x) + 0.5f''(a)(0-x)^2
f(1) = f(x) + f'(x)(1-x) + 0.5f''(b)(1-x)^2

Subtract the first equation from the second; the f(x) terms cancel and the f'(x) terms combine to f'(x)(1-x+x) = f'(x). Solving for f'(x) and taking absolute values:

|f'(x)| = |f(1) - f(0) + 0.5f''(a)x^2 - 0.5f''(b)(1-x)^2|
        <= |f(1)-f(0)| + 0.5A(x^2 + (1-x)^2)
        <= |f(1)-f(0)| + 0.5A

The last inequality holds because the quadratic x^2 + (1-x)^2 attains its maximum value 1 on [0,1], at the endpoints x = 0 and x = 1.
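As a quick numerical sanity check (not part of the proof), one can sample the inequality on a grid for a concrete test function. The function f(x) = sin(3x) below is my own choice for illustration; A is taken as the sampled maximum of |f''| on [0,1].

```python
import math

def check_bound(f, f1, f2, n=1000):
    """Sample x on [0,1] and verify |f'(x)| <= |f(1)-f(0)| + A/2,
    where A is the (sampled) maximum of |f''| on [0,1]."""
    xs = [i / n for i in range(n + 1)]
    A = max(abs(f2(x)) for x in xs)          # estimate of max |f''|
    bound = abs(f(1) - f(0)) + A / 2         # right-hand side of the claim
    ok = all(abs(f1(x)) <= bound for x in xs)
    return ok, bound

# Test function: f(x) = sin(3x), f'(x) = 3cos(3x), f''(x) = -9sin(3x)
f  = lambda x: math.sin(3 * x)
f1 = lambda x: 3 * math.cos(3 * x)
f2 = lambda x: -9 * math.sin(3 * x)

ok, bound = check_bound(f, f1, f2)
print(ok)  # True: max |f'| = 3 on [0,1], while the bound is about 4.64
```

Here A is roughly 9 (since |f''(x)| = 9|sin(3x)| reaches about 1 near x = pi/6), so the bound |sin 3| + 4.5 comfortably dominates max |f'| = 3, as the theorem predicts.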