It's not hard to show that the function:
g = \frac{1}{2} (c \times r)
is a "vector potential" function for the constant vector "c". That is, that:
\nabla \times g = c
The calculation is straightforward to carry out in Cartesian coordinates, and I won't reproduce it here.
However...
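For what it's worth, the curl identity can be spot-checked symbolically; a SymPy sketch (symbol names are my own, not from the post):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
c1, c2, c3 = sp.symbols('c1 c2 c3')   # components of the constant vector c
r = sp.Matrix([x, y, z])
c = sp.Matrix([c1, c2, c3])
g = c.cross(r) / 2                    # g = (1/2)(c x r)

# curl in Cartesian coordinates, component by component
curl_g = sp.Matrix([
    sp.diff(g[2], y) - sp.diff(g[1], z),
    sp.diff(g[0], z) - sp.diff(g[2], x),
    sp.diff(g[1], x) - sp.diff(g[0], y),
])
assert curl_g == c                    # nabla x g = c
```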
The Basel Problem is a well-known result in analysis which states:
\frac{1}{1^2} + \frac{1}{2^2} + \frac{1}{3^2} + \frac{1}{4^2} + ... = \frac{\pi^2}{6}
There are various well-known ways to prove this.
I was wondering if there is a similar, simple way to calculate the value of the...
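The partial sums converge to \pi^2/6 (slowly, with error on the order of 1/N), which is easy to check numerically:

```python
import math

# partial sum of the Basel series up to N terms
N = 100000
s = sum(1.0 / n**2 for n in range(1, N + 1))

# truncation error is roughly 1/N, so agreement to ~1e-5 is expected
assert abs(s - math.pi**2 / 6) < 1e-4
```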
I'm reading from Wikipedia:
I thought linear operators always had eigenvalues, since you could always form a characteristic equation for the corresponding matrix and solve it?
Is that not the case? Are there linear operators that don't have eigenvalues?
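For what it's worth, the answer depends on the field and the dimension: over C, every operator on a finite-dimensional space has an eigenvalue (the characteristic polynomial has a root), but over R it can fail, and in infinite dimensions even over C there may be none (e.g., the right-shift operator on \ell^2). A small sketch with a rotation matrix, whose characteristic polynomial t^2 + 1 has no real roots:

```python
import numpy as np

# 90-degree rotation of R^2: no real eigenvalues
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
vals = np.linalg.eigvals(R)

# both eigenvalues are purely imaginary (+i and -i)
assert all(abs(v.imag) > 0.5 for v in vals)
assert all(abs(v.real) < 1e-9 for v in vals)
```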
Homework Statement
I'm trying to show that every affine function f can be expressed as:
f(x) = Ax + b
where b is a constant vector, and A a linear transformation.
Here an "affine" function is one defined as possessing the property:
f(\alpha x + \beta y) = \alpha f(x) + \beta f(y)...
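Assuming the usual restriction \alpha + \beta = 1 (which is forced whenever b \neq 0: take x = y = 0 in the identity), here is a quick numerical spot check with sample A and b of my own choosing:

```python
import numpy as np

# sample affine map f(x) = Ax + b (values are illustrative only)
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([1.0, -1.0])
f = lambda v: A @ v + b

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
a, c = 0.7, 0.3                      # affine combination: a + c = 1

# f(ax + cy) = aAx + cAy + b = a f(x) + c f(y) precisely because a + c = 1
assert np.allclose(f(a * x + c * y), a * f(x) + c * f(y))
```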
I'm reading from an introductory text on quantum physics, and came across this sentence:
It's the second sentence that I don't understand: how can the energy in the EM field be responsible for the ability of a hollow cavity to absorb heat?
Making a change of basis in the matrix representation of a linear operator will not change the eigenvalues of that linear operator, but could making such a change of basis affect the geometric multiplicities of those eigenvalues?
I'm thinking that the answer is "no," it cannot.
Since if...
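Indeed it cannot: similarity preserves geometric multiplicity, since \ker(P^{-1}AP - \lambda I) = P^{-1}\ker(A - \lambda I) has the same dimension. A numerical spot check (the matrices are my own examples):

```python
import numpy as np
from numpy.linalg import matrix_rank, inv

# A is a single 2x2 Jordan block: eigenvalue 1, geometric multiplicity 1
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])           # an arbitrary invertible change of basis
B = inv(P) @ A @ P

def geo_mult(M, lam):
    # geometric multiplicity = dim ker(M - lam I) = n - rank(M - lam I)
    n = M.shape[0]
    return n - matrix_rank(M - lam * np.eye(n))

assert geo_mult(A, 1.0) == geo_mult(B, 1.0) == 1
```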
Is an ideal always a linear space?
I'm reading a proof, where the author is essentially saying: (1) since x is in the ideal I, and (2) since y is in the ideal I; then clearly x-y is in the ideal I.
In other words, if we have two elements belonging to the same ideal, is their linear...
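Whether an ideal is literally a linear *space* depends on having a field of scalars, but every ideal is at least an additive subgroup closed under multiplication by ring elements, which is exactly what the proof step uses. A toy check with the ideal 6Z in Z (my own example):

```python
# membership test for the ideal I = 6Z in the ring Z
I = lambda k: k % 6 == 0

x, y, r = 18, 42, 7
assert I(x) and I(y)
assert I(x - y)        # additive subgroup: x - y stays in I
assert I(r * x)        # absorption: r*x stays in I for any ring element r
```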
Is there such a thing as a square matrix with no eigenvectors?
I'm thinking not ... since even if you have:
\left[\begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array}\right]
you could just as well say that the eigenvalue(s) are 0 (w/ algebraic multiplicity 2) and the eigenvectors are:
u_1 =...
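One way to spot-check the zero-matrix case numerically (over C, every square matrix has at least one eigenvector, since its characteristic polynomial always has a root):

```python
import numpy as np

Z = np.zeros((2, 2))
vals, vecs = np.linalg.eig(Z)

# eigenvalue 0 with algebraic multiplicity 2...
assert np.allclose(vals, 0.0)
# ...and every nonzero vector is an eigenvector of the zero matrix
v = np.array([3.0, -1.0])
assert np.allclose(Z @ v, 0.0 * v)
```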
Which schools have the best programs in computational astrophysics?
Judging by their websites, Princeton and Univ. of Chicago seem to have strong programs, but I was wondering if people knew of others as well?
Suppose we have a linear transformation/matrix A, which has multiple left inverses B1, B2, etc., such that, e.g.:
B_1 \cdot A = I
Can we conclude from this (i.e., from the fact that A has multiple left inverses) that A has no right inverse?
If so, why is this?
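Yes: if A had both a left inverse L and a right inverse R, then L = L(AR) = (LA)R = R, so the left inverse would be unique; hence two distinct left inverses rule out any right inverse. A concrete sketch with a tall (injective but not surjective) matrix of my own choosing:

```python
import numpy as np

# injective 3x2 matrix: full column rank, but not surjective onto R^3
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# two distinct left inverses (the third column is free)
B1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])
B2 = np.array([[1.0, 0.0, 5.0],
               [0.0, 1.0, 7.0]])
assert np.allclose(B1 @ A, np.eye(2))
assert np.allclose(B2 @ A, np.eye(2))

# no right inverse exists: A @ C has rank at most 2, never the 3x3 identity
C = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
assert np.linalg.matrix_rank(A @ C) == 2
```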
Homework Statement
Is the set of all continuous functions (defined on say, the interval (a,b) of the real line) a vector space?
Homework Equations
None.
The Attempt at a Solution
I'm inclined to say "yes", since if I have two continuous functions, say, f and g, then their sum f+g...
Are Lattice Gauge Theories still considered an area of active physics research? (i.e., are people still producing PhDs in this subject?) Or has this research area become passé?
I just want to test/verify my knowledge of change of basis in a linear operator (it's not a homework question).
Suppose I have linear operator mapping R^2 into R^2, and expressed in the canonical basis (1,0), (0,1). Suppose (for the sake of discussion) that the linear operator is given by...
Suppose I have a linear operator A on an n-dimensional space, and suppose that this operator has a non-trivial null space. That is:
A \cdot x = 0
Suppose the dimension of the null space is k < n; that is, I can find k > 0 linearly independent vectors, each of which yields the 0 vector when the linear...
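This setup is easy to sanity-check with SymPy's nullspace, which also illustrates rank-nullity (the matrix below is my own sample):

```python
import sympy as sp

# sample 3x3 matrix with rank 2, so its null space has dimension k = 1
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])
n = A.cols
null_basis = A.nullspace()           # k linearly independent null vectors
k = len(null_basis)

# rank-nullity: dim null(A) + rank(A) = n
assert k + A.rank() == n
for v in null_basis:
    assert A * v == sp.zeros(3, 1)   # each basis vector maps to 0
```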
Homework Statement
I'm working on a problem involving hyperplanes and factor spaces. It involves a bit of setup. I'll describe first the definitions. Suppose you have a vector space K, of dimension n. Suppose you have a linear subspace L of K. Choose a vector x_0 \in K, then the hyperplane H...
Homework Statement
Given a set of polynomials in x:
x^{r_1}, x^{r_2},...,x^{r_n}
where r_i \neq r_j for all i \neq j (in other words, the powers are distinct), where the functions are defined on an interval (a,b) where 0 < a < x < b (specifically, x \neq 0), I'd like to show that this...
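One standard route is the Wronskian: if it is not identically zero on (a,b), the functions are linearly independent there. A SymPy sketch with sample exponents (my own choice, not the problem's):

```python
import sympy as sp

x = sp.symbols('x', positive=True)   # x > 0, matching 0 < a < x < b
powers = [sp.Rational(1, 2), 2, 5]   # sample distinct exponents r_i
funcs = [x**r for r in powers]

# Wronskian matrix: k-th row holds the k-th derivatives
W = sp.Matrix([[sp.diff(f, x, k) for f in funcs]
               for k in range(len(funcs))])

# a nonvanishing Wronskian on (a,b) implies linear independence there
assert sp.simplify(W.det()) != 0
```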
I understand that the massive object at the center of the galaxy is visible in both radio waves and X-rays, although not in visible light.
I also understand that an image has recently been taken of this object in the infrared wavelengths.
Have there been any attempts to do spectroscopy on...
Is it true in general that:
|\int f(x)dx| \leq \int |f(x)|dx
Not sure if "triangle inequality" is the right name for it, but that seems to be what's involved.
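Yes, this is usually called the triangle inequality for integrals, with \leq in general; equality holds when f does not change sign, and the inequality is strict when it does. A numerical illustration with f(x) = \sin x on [0, 3\pi/2] (my own sample):

```python
import numpy as np

x = np.linspace(0.0, 1.5 * np.pi, 200001)
dx = np.diff(x)
trap = lambda y: np.sum((y[1:] + y[:-1]) / 2 * dx)   # trapezoid rule

# sin changes sign on [0, 3pi/2], so the inequality is strict:
# |integral| ~ 1, integral of |sin| ~ 3
f = np.sin(x)
lhs, rhs = abs(trap(f)), trap(np.abs(f))
assert lhs < rhs

# for f >= 0 the two sides agree (equality case)
g = np.abs(np.sin(x))
assert np.isclose(abs(trap(g)), trap(np.abs(g)))
```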
Homework Statement
From the relation:
A(x^2+y^2) - 2Bxy + C = 0
derive the differential equation:
\frac{dx}{\sqrt{x^2-c^2}} + \frac{dy}{\sqrt{y^2-c^2}} = 0
where c^2 = AC(B^2-A^2)
The Attempt at a Solution
I'm able to (more or less) do the derivation, but I think the correct...
Homework Statement
Make the following change of variables:
x = r \cos \theta
y = r \sin \theta
and integrate the following equation:
(xy'-y)^2 = a(1+y'^2)\sqrt{x^2+y^2}
The Attempt at a Solution
First it's worth noting that the equation x^2+y^2=a^2 (even without changing...
Homework Statement
Solve the following equation:
y^2- xy + (x+1)y' = 0
The Attempt at a Solution
The equation isn't exact, and it isn't homogeneous.
I've tried a range of different substitutions, including v = y - x, v = y^2, v = y^2 - xy, none of which seem to lead down a fruitful...
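Not among the attempts above, but one substitution that does work here is the Bernoulli one, v = 1/y, which makes the equation linear in v. A SymPy check of the reduction:

```python
import sympy as sp

x = sp.symbols('x')
v = sp.Function('v')

# Bernoulli substitution y = 1/v
y = 1 / v(x)
ode = y**2 - x*y + (x + 1) * sp.diff(y, x)   # LHS of the original equation

# clearing denominators leaves an expression linear in v and v'
reduced = sp.simplify(ode * v(x)**2)
expected = 1 - x * v(x) - (x + 1) * sp.diff(v(x), x)
assert sp.simplify(reduced - expected) == 0
```

So the equation becomes v' + \frac{x}{x+1} v = \frac{1}{x+1}, a first-order linear ODE solvable by an integrating factor.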
Homework Statement
I'm trying to integrate the following form:
y = \int e^{\sin x} dx
The Attempt at a Solution
I thought about trying to write something like:
y = \int e^{\frac{i}{2}e^{-ix} - \frac{i}{2}e^{ix}} dx
But this seems to lead down the road of trying to integrate...
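For what it's worth, e^{\sin x} has no elementary antiderivative, though definite integrals are easy numerically; over a full period the known closed form is \int_0^{2\pi} e^{\sin x}dx = 2\pi I_0(1) \approx 7.9549, with I_0 the modified Bessel function. A quick trapezoid check:

```python
import numpy as np

# trapezoid rule on a fine grid over one full period
x = np.linspace(0.0, 2.0 * np.pi, 400001)
y = np.exp(np.sin(x))
I = np.sum((y[1:] + y[:-1]) / 2 * np.diff(x))

# compare against 2*pi*I_0(1) = 7.95492652...
assert abs(I - 7.95492652) < 1e-4
```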
Is there a simple way to show that when we differentiate the following expression (call this equation 1):
Y(x) = \frac{1}{n!} \int_0^x (x-t)^n f(t)dt
that we will get the following expression (call this equation 2):
Y'(x) = \frac{1}{(n-1)!}\int_0^x (x-t)^{n-1}f(t)dt
It's simple...
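A spot check of the claimed derivative for a concrete n and f (sample choices of my own), using SymPy:

```python
import sympy as sp

x, t = sp.symbols('x t')
n = 3
f = t**2   # any concrete f will do for a spot check

# equation 1 and the claimed derivative, equation 2
Y  = sp.integrate((x - t)**n     * f, (t, 0, x)) / sp.factorial(n)
Yp = sp.integrate((x - t)**(n-1) * f, (t, 0, x)) / sp.factorial(n - 1)

# differentiating Y does reproduce equation 2
assert sp.simplify(sp.diff(Y, x) - Yp) == 0
```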
I'm reading Ince on ODEs, and I'm in the section (in Chapter 5) where he talks about the Wronskian. There are quite a few things here that I don't quite understand or follow.
I'm not going to get into all the details, but briefly, suppose we have the Wronskian of k functions:
W =...
Suppose we have the differential equation:
\frac{dx}{\sqrt{1-x^2}} + \frac{dy}{\sqrt{1-y^2}} = 0
It can be rewritten as:
\sqrt{1-y^2}dx + \sqrt{1-x^2}dy = 0
One solution of this equation (besides arcsin(x) + arcsin(y) = C), is given by:
x\sqrt{1-y^2} + y\sqrt{1-x^2} = C...
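The second form is sin(arcsin x + arcsin y) = C by the sine addition formula, which is why both forms solve the equation. A SymPy check that the implicit curve satisfies the ODE:

```python
import sympy as sp

x, y = sp.symbols('x y')
F = x * sp.sqrt(1 - y**2) + y * sp.sqrt(1 - x**2)

# along the ODE, dy/dx = -sqrt(1-y^2)/sqrt(1-x^2)
yprime = -sp.sqrt(1 - y**2) / sp.sqrt(1 - x**2)

# total derivative of F along solution curves should vanish
dF = sp.diff(F, x) + sp.diff(F, y) * yprime
assert sp.simplify(dF) == 0
```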
Just so I have the concept of a singular solution down correctly, suppose I have an equation like:
\left(x+y\right)^2y' = 0
This admits of two solutions:
y=-x
and, from:
y' = 0
y = C
where C is a constant.
So the "two" solutions for the equation would be:
y_1=-x, y_2 =...
Homework Statement
I'm working on Problem 8, Chapter 1 in the classic Ince text on ODEs.
The question has me somewhat confused.
It involves Euler's Theorem on Homogeneous functions. For reference, this theorem states that if you have a function f in two variables (x,y) and homogeneous in...
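For reference, the theorem (x f_x + y f_y = n f for f homogeneous of degree n) is easy to spot-check symbolically with a sample function of my own choosing:

```python
import sympy as sp

x, y, t = sp.symbols('x y t')
f = x**3 + x * y**2          # homogeneous of degree 3 (a sample, not Ince's)
n = 3

# homogeneity: f(tx, ty) = t^n f(x, y)
assert sp.simplify(f.subs({x: t*x, y: t*y}, simultaneous=True) - t**n * f) == 0
# Euler's theorem: x f_x + y f_y = n f
assert sp.simplify(x * sp.diff(f, x) + y * sp.diff(f, y) - n * f) == 0
```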
Is there such a thing as a homogeneous function of degree n < 0?
Considering functions of two variables, the expression:
f(x,y) = \frac{y}{x}
is homogeneous of degree 0, since:
f(tx,ty) = \frac{ty}{tx} = \frac{y}{x} = f(x,y) = t^0 \cdot f(x,y)
and the expression:
f(x,y) = x...
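Yes; for instance f(x,y) = \frac{1}{x+y} is homogeneous of degree -1 (my own example). A SymPy check:

```python
import sympy as sp

x, y, t = sp.symbols('x y t', positive=True)
f = 1 / (x + y)              # candidate of degree -1

# f(tx, ty) = t^{-1} f(x, y)
assert sp.simplify(f.subs({x: t*x, y: t*y}, simultaneous=True) - f / t) == 0
```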