The popular vector operator ∇ ⋅ ∇ = ∇² is known as the Laplacian operator.
Is it possible to invent a new vector operator ∇ × ∇ that isn't known yet?
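As a quick sanity check of the identity ∇ ⋅ ∇ = ∇², here is a small sketch using sympy; the scalar field f is an arbitrary choice for illustration, not something from the question:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 * y + sp.sin(y * z)  # an arbitrary smooth scalar field

# Gradient: the vector of first partial derivatives.
grad_f = [sp.diff(f, v) for v in (x, y, z)]

# Divergence of the gradient: sum over the partials of each component.
div_grad_f = sum(sp.diff(g, v) for g, v in zip(grad_f, (x, y, z)))

# Laplacian computed directly as the sum of second partials.
laplacian_f = sum(sp.diff(f, v, 2) for v in (x, y, z))

print(sp.simplify(div_grad_f - laplacian_f))  # 0
```

The two computations agree for any smooth field, which is why ∇ ⋅ ∇ and ∇² name the same operator.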
The answer is NO! Since ∇ × ∇ = 0. (This is similar to the vector identity A⃗ × A⃗ = 0⃗.)
This is not a solution. The operator you are referring to is curl(grad(·)). And no, you cannot say that because A × A = 0 for any vector A, ∇ × ∇ is therefore a zero operator. That reasoning is flawed because, as I said in my solution, the commutativity of the second-order mixed partial derivatives is necessary. Also, the dot in your ∇ ⋅ ∇ is not the same as the dot in the dot product between vectors; it is just a notation. You cannot multiply operators the same way you do with vectors or scalars. ∇ ⋅ ∇ means we find the gradient of the scalar function first and then calculate its divergence, whereas ∇ × ∇ means we find the gradient of the scalar function first and then calculate its curl. Please elaborate your solution as much as possible so as to leave no ambiguity; otherwise it might lead some learners to develop misconceptions about basic things.
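The point that ∇ × ∇ means "gradient first, then curl" can be checked symbolically. A minimal sketch with sympy, using an arbitrarily chosen smooth scalar field f:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = sp.exp(x * y) + z**3 * sp.cos(x)  # an arbitrary smooth scalar field

# Gradient of f.
fx, fy, fz = sp.diff(f, x), sp.diff(f, y), sp.diff(f, z)

# Curl of the gradient, component by component.
curl = (
    sp.diff(fz, y) - sp.diff(fy, z),
    sp.diff(fx, z) - sp.diff(fz, x),
    sp.diff(fy, x) - sp.diff(fx, y),
)
print([sp.simplify(c) for c in curl])  # [0, 0, 0]
```

Every component cancels because the mixed second partials of a smooth function commute, which is exactly the condition the reply insists on.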
Log in to reply
That one is the Laplacian operator and not "first gradient and then divergence"; I was trying to show why we cannot have a ∇ × ∇ operator, unlike the Laplacian operator, and I didn't mean to imply "first gradient and then curl". Well, this solution and the question I posted are two years old, and by now I have a better understanding of operators, so I agree that I cannot compare vectors with operators in any sense. I will have to delete this solution or come up with a better one. Anyway, thanks!
Log in to reply
Well, I do not want to come off as arrogant, but the definition of the Laplacian operator is precisely the divergence of the gradient of a function (read the first line of the wiki entry). I will just post a few lines from the Wikipedia entry on the Laplace operator: "The Laplace operator is a second-order differential operator in the n-dimensional Euclidean space, defined as the divergence (∇·) of the gradient (∇f). Thus if f is a twice-differentiable real-valued function, then the Laplacian of f is defined by:
Δf = ∇²f = ∇ ⋅ ∇f"
Here's the link to the wiki: https://en.wikipedia.org/wiki/Laplace_operator
But I am unsure as to which Laplacian operator you are referring to. There is the scalar Laplacian operator, which is nothing but the divergence of the gradient of a function. And there is a vector Laplacian operator (which is less used), which is very similar to the scalar form except that you apply it to each of the rectangular components. In fact, the vector Laplacian is a generalization of the scalar Laplacian, which again is nothing except div(grad(·)).
And operators are just symbolic notation for some operation. So ∇ × ∇ has no meaning except that we subject the function to the gradient operator first, which is ∇(·), and then to the curl operator, which is ∇×, which in totality gives ∇ × ∇. But the × in ∇× and ∇ × ∇ is not the vector cross product in any sense; it is merely similar to it, because î ∂/∂x is neither a vector nor a rectangular component, so it is just a notation and not in reality a vector. There is nothing wrong with defining such an operator, but the problem is that whenever you calculate the curl of the gradient of a function, you end up with it being identically 0. So ∇ × ∇ is nothing but a zero operator: it gives you 0 identically for every function (provided they satisfy the commutativity of the second-order partial derivatives). Hence it is counter-intuitive to define an operator which gives you zero identically.
Popularly, ∇ × ∇ and the Laplacian are defined for scalar functions. Now ∇ × ∇ may also be applied to a vector function, but we have to use dyadics and unit dyads for its calculation. In any case, curl(grad) always gives zero, and curl(grad) is precisely the definition of ∇ × ∇, be it for a vector or a scalar function.
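The vector Laplacian mentioned above acts componentwise in Cartesian coordinates. A small sketch of that, with F an arbitrary vector field chosen for illustration:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
# An arbitrary vector field F = (Fx, Fy, Fz), chosen for illustration.
F = (x * y**2, sp.sin(x * z), y + z**2)

# Vector Laplacian in Cartesian coordinates: apply the scalar Laplacian
# (sum of second partials) to each rectangular component separately.
vec_lap = [sum(sp.diff(comp, v, 2) for v in (x, y, z)) for comp in F]
print(vec_lap)
```

Each entry of `vec_lap` is just div(grad(·)) of one component, which is the generalization the reply describes.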
Log in to reply
@Arghyadeep Chatterjee – You don't at all sound arrogant! Thank you so much for the explanation, you elaborated it so well!
curl(grad(A)) = 0 for all functions A whose second-order partial derivatives are continuous.
curl(∂A/∂x î + ∂A/∂y ĵ + ∂A/∂z k̂)
= (∂²A/∂y∂z − ∂²A/∂z∂y) î + (∂²A/∂z∂x − ∂²A/∂x∂z) ĵ + (∂²A/∂x∂y − ∂²A/∂y∂x) k̂
Assuming that Schwarz's theorem (also known as Clairaut's or Young's theorem) is applicable, or that the symmetry of the second-order mixed partial derivatives holds by some other condition, we can say that ∂²A/∂x∂y = ∂²A/∂y∂x, and so on.
This yields the result curl(grad(A)) = 0.
It is vital that the symmetry exists; otherwise the result does not hold.
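The symmetry condition the solution leans on can be checked directly. A sketch with sympy, using an arbitrary smooth function A (not from the problem):

```python
import sympy as sp

x, y = sp.symbols('x y')
A = x**3 * sp.sin(y) + sp.log(x * y)  # an arbitrary smooth function

# Mixed second partials in both orders of differentiation.
dxy = sp.diff(A, x, y)  # differentiate with respect to x, then y
dyx = sp.diff(A, y, x)  # differentiate with respect to y, then x

print(sp.simplify(dxy - dyx))  # 0
```

For functions with continuous second partials the two orders agree (Schwarz's theorem), which is exactly what makes each component of curl(grad(A)) cancel.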
Here the operator in the question is curl(grad(·)) and not anything else. And the ∇ ⋅ ∇ in the question is nothing but div(grad(·)); the dot between them is just a notation. Nothing is being multiplied as scalars here. What we are doing is first applying the gradient operation and then finding the divergence of the gradient of the function. Symbolically, it is represented as ∇ ⋅ ∇.