Receiver sensitivity with temperature - RF Cafe Forums

The original RF Cafe Forums were shut down in late 2012 due to maintenance issues. Please visit the new and improved RF Cafe Forums that were created in September of 2015. Unlike the old forums, where users registered individually, the new forums use a common User Name and Password, so anyone can post without needing to create an account. Please find the current User Name and Password on the RF Cafe homepage. Thanks for your participation.

Below are all of the old forum threads, including all the responses to the original posts.

-- Amateur Radio
-- Anecdotes, Gripes & Humor
-- Antennas
-- CAE, CAD, & Software
-- Circuits & Components
-- Employment & Interviews
-- Miscellany
-- Swap Shop
-- Systems
-- Test & Measurement
-- Webmaster

 Post subject: Receiver sensitivity with temperature
Posted: Fri Nov 16, 2007 6:38 am 

Joined: Fri Nov 16, 2007 5:06 am
Posts: 1

We have a debate in my company on how to calculate the receiver sensitivity with temperature.
Sensitivity = kTB + NF + C/N
C/N is the demodulator requirement, and it does not vary with temperature for a digital implementation.
NF is the noise figure, and it is referenced to 290 K according to the IEEE standard (Friis proposed this reference temperature in 1944). The NF itself varies with temperature, of course.
My point is that, when you compute the sensitivity at different temperatures, the "T" in kTB should remain equal to 290 K; otherwise you account for the thermal noise change twice, once in the NF and once in kTB.
What is your opinion on that?

Cheers, Patrikc
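The calculation in the post above can be sketched numerically. This is a minimal illustration of the Sensitivity = kTB + NF + C/N formula with T held at the 290 K reference; the function name and the example values are illustrative, not from the post:

```python
import math

K = 1.380649e-23   # Boltzmann's constant, J/K
T0 = 290.0         # IEEE reference temperature, K (held fixed)

def sensitivity_dbm(bandwidth_hz, nf_db, cn_db):
    """Sensitivity (dBm) = kT0B (dBm) + NF (dB) + C/N (dB), with T0 fixed at 290 K."""
    ktb_dbm = 10 * math.log10(K * T0 * bandwidth_hz / 1e-3)  # thermal noise floor in dBm
    return ktb_dbm + nf_db + cn_db

# Example: 1 MHz bandwidth, 5 dB noise figure, 10 dB required C/N
print(sensitivity_dbm(1e6, 5.0, 10.0))  # about -99 dBm
```

Because T0 is a constant in this sketch, any temperature dependence of the result must enter through the NF term alone, which is the crux of the question.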

 Post subject:
Posted: Fri Nov 16, 2007 2:42 pm 
User avatar

Joined: Tue Jun 26, 2007 10:27 am
Posts: 21
Location: Dallas, TX
Hi Patrikc,

Here is my take on the issue at hand.

Looking at the equation

    Pin_mds = kTB + NF + C/N

we must first note that the bandwidth and the required C/N at the input of the demodulator usually do not change with temperature. Therefore, the only two terms in question are the NF and Pin_mds (the sensitivity).

The NF is defined as the SNR (dB) at the input of the system minus the SNR (dB) at the output of the system. The NF may also be derived from the total integrated input-referred voltage noise relative to the voltage noise produced by the source resistance (typically 50 Ohm). In the latter case, the integrated input-referred noise does change with temperature, but the reference voltage noise is still calculated using the 290 K number (~0.895 nV/sqrt(Hz) for 50 Ohm at 290 K). Therefore, the NF does increase with temperature.
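The ~0.895 nV/sqrt(Hz) reference value quoted above follows from the Johnson-Nyquist formula sqrt(4kTR); here is a quick check (the function and variable names are mine, not from the post):

```python
import math

K = 1.380649e-23  # Boltzmann's constant, J/K

def source_noise_density(r_ohms, t_kelvin=290.0):
    """Thermal (Johnson-Nyquist) noise voltage density sqrt(4kTR), in V/sqrt(Hz)."""
    return math.sqrt(4 * K * t_kelvin * r_ohms)

print(source_noise_density(50.0))  # ~0.895e-9 V/sqrt(Hz) for 50 Ohm at 290 K
```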

The kT term may be assumed only if the input impedance of the system is matched to the source resistance. This term should be treated as a constant, much like the noise of the source resistance. The input-referred voltage noise of the system then accounts for the noise-floor variation inherently, because the kT noise is already included in it.
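One way to see how the temperature dependence ends up entirely inside the NF term is to convert the NF into an effective input noise temperature Te and let Te scale with the physical temperature. The linear scaling below is an illustrative assumption (it holds exactly for a matched resistive attenuator, not for every receiver front end):

```python
import math

T0 = 290.0  # IEEE reference temperature, K

def nf_at_temperature(nf_db_at_290, t_phys_kelvin):
    """NF recomputed at a new physical temperature, assuming the effective
    input noise temperature Te scales linearly with physical temperature
    (an illustrative model, e.g. a matched resistive attenuator)."""
    te_290 = T0 * (10 ** (nf_db_at_290 / 10.0) - 1.0)  # Te implied by NF at 290 K
    te = te_290 * (t_phys_kelvin / T0)                  # scale with physical temperature
    return 10 * math.log10(1.0 + te / T0)               # back to dB, still referenced to T0

print(nf_at_temperature(3.0, 350.0))  # NF rises above 3 dB as the hardware heats up
```

Note that the kTB term in the sensitivity formula keeps T = 290 K throughout; the temperature variation is carried by the NF alone, which is exactly the original poster's point.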

In short, you are correct in your statement.


Posted 11/12/2012