
Fixed Wireless - health issues

  • 18-05-2006 11:09am
    #1
    Registered Users, Registered Users 2 Posts: 232 ✭✭


    Hi,

I'm after reading a thread below on Clearwire, and I have a similar question about fixed wireless in general.

I have the fixed wireless transmitter/receiver installed just outside the window of my 2-year-old child's bedroom. He sleeps right beside the window. The broadband is left on 24/7 (as I use a VoIP ATA).

I presume the transmission from my side is directional (hence the line-of-sight requirement), but I wonder how much of my transmitted signal spills back towards the house, as my unit's transmission must surely be much stronger than the attenuated signal that reaches me from the base station.

Maybe I should just get it re-installed somewhere else.
Does anyone know where I could find more info on this?

    Any help is much appreciated.


Comments

  • Registered Users, Registered Users 2 Posts: 1,660 ✭✭✭crawler


If you look at the maximum power output ComReg allows FWA operators, you'll realise just how little power the transmitters can actually use...

A good comparison is FWA maximum output versus GSM cell maximum output :)


  • Registered Users, Registered Users 2 Posts: 232 ✭✭fisab


    Many thanks for the reply Crawler.

Are you saying that a mobile phone on the window sill transmits at much higher power to its base station than my FWA transmitter/receiver transmits to the high-site?


  • Registered Users, Registered Users 2 Posts: 1,660 ✭✭✭crawler


    good question...

In the GSM world, the further you are from a base station the harder the phone has to work, i.e. the more power it outputs. When close to a base station it lowers its power to conserve battery.

So ironically, if the GSM transmitter is far away from you, the yoke you hold to your HEAD is outputting more power.

This is where it becomes really ironic: people unhappy with mast applications are sometimes doing themselves a disservice by blocking the planning, as they end up with poor local service and phones working flat out.

On the flip side, the GSM transmitters have a higher output than the phones (naturally), so which is best???

There is even the ongoing debate on the safety of mobile phones...

    In either case with FWA broadband there is probably more power output from your little toe!! :)

All of this is a bit of a generalisation on my part, as there are a large number of considerations (frequency being used, type of antenna, beam forming, path loss, cell density, etc. etc.)

The ITU-T specify maximum safe power levels for all electronic devices, and to my knowledge all the mobile operators stay well within these limits. ComReg also regularly tests random sites for all operators, and I believe all the tests have been fine.

    People will ( I GUARANTEE) have their own opinions anyway.....


  • Registered Users, Registered Users 2 Posts: 240 ✭✭Manchegan


    Wireless Local Area Networks (WLANs)

    The HPA does not consider there to be a problem with the safety of WLAN. If an explicit statement that exposures are within the ICNIRP guidelines is required, this would have to be obtained from the manufacturers; however, it could be argued that this is implicit in the CE marking.


  • Technology & Internet Moderators Posts: 28,811 Mod ✭✭✭✭oscarBravo


    fisab wrote:
Are you saying that a mobile phone on the window sill transmits at much higher power to its base station than my FWA transmitter/receiver transmits to the high-site?
    With the crap Voda coverage in my house, I can categorically guarantee that my mobile is putting out much higher power levels than any FWA gear.

    Let me put it like this: a light bulb emits (say) 60-100W of electromagnetic radiation. FWA equipment operating in licence-exempt spectrum emits 0.1W of electromagnetic radiation, focused towards the base station.

    As a colleague of mine once put it: "the only way a FWA antenna could harm you is if I hit you on the head with it." :)


  • Technology & Internet Moderators Posts: 28,811 Mod ✭✭✭✭oscarBravo


    crawler wrote:
A good comparison is FWA maximum output versus GSM cell maximum output :)
    ...and, to put that in context, the exposure to RF energy while standing at the perimeter fence of a GSM base station is generally in or around 10,000 times lower than the accepted safe level - which is in itself conservative.


  • Registered Users, Registered Users 2 Posts: 232 ✭✭fisab


    oscarBravo wrote:
    <snip>
    As a colleague of mine once put it: "the only way a FWA antenna could harm you is if I hit you on the head with it." :)

    LOL!!!

    Thanks guys for all the input.


  • Closed Accounts Posts: 1,467 ✭✭✭bushy...


    oscarBravo wrote:
    FWA equipment operating in licence-exempt spectrum emits 0.1W of electromagnetic radiation, focused towards the base station.
...yip, meanwhile your cuddly Ripwave modem from IBB can manage ten times that (http://www.tuanistechnology.com/products/navini/data.pdf)


  • Registered Users, Registered Users 2 Posts: 856 ✭✭✭andrew163


At 2.4GHz it can only legally put out 0.1W in this country.
Not sure about 3.5GHz (do IBB even use their 3.5GHz spectrum for Ripwave here?)...


  • Closed Accounts Posts: 5 Nio


I would not be happy living near an IBB base station, given the company's previous reputation for not giving a sh*t about anything, really.

People don't realise that commercial, "non-line-of-sight" FWA base stations generate much higher outputs than mobile masts. At 3.5GHz it takes a lot more energy to achieve non-line-of-sight communications than at mobile frequencies. FWA manufacturers were trying to give the impression that non-LOS performance was due to improvements in modulation such as OFDM; however, it seems likely that the net increase in signal strength from techniques like beam-forming plays a bigger role.

This is not good news for the future of wireless comms.
We should not allow a situation to develop where a relatively small number of high-powered stations provide a crappy service while exposing local residents to radio frequencies that are far less tested than mobile services, yet at the same time a lot more powerful.


  • Technology & Internet Moderators Posts: 28,811 Mod ✭✭✭✭oscarBravo


    Nio wrote:
People don't realise that commercial, "non-line-of-sight" FWA base stations generate much higher outputs than mobile masts.
    I'm anything but an IBB fan, but I'm not prepared to let that go unchallenged.
    Nio wrote:
At 3.5GHz it takes a lot more energy to achieve non-line-of-sight communications than at mobile frequencies.
Maybe - but experience seems to suggest that NLOS works a hell of a lot better at mobile frequencies than at 3.5GHz. Millions of people use mobile phones in NLOS situations every day, whereas NLOS broadband providers seem to be having patchy results at best.
    Nio wrote:
FWA manufacturers were trying to give the impression that non-LOS performance was due to improvements in modulation such as OFDM; however, it seems likely that the net increase in signal strength from techniques like beam-forming plays a bigger role.
    Beam-forming doesn't increase signal strength. If a base station sector has a total output power of (say) 2W, then any technique to increase gain - whether it's beam-forming, physical antenna gain or anything else - can't increase the total output power past 2W. And that's assuming that the beam-forming technology is remotely efficient.
    Nio wrote:
We should not allow a situation to develop where a relatively small number of high-powered stations provide a crappy service while exposing local residents to radio frequencies that are far less tested than mobile services, yet at the same time a lot more powerful.
    "A lot more powerful" assumes that the sum of the base station transmit power plus antenna gain is greater than the same sum for a mobile phone base station. Unless you can demonstrate that this is the case (and we're talking about serious transmit power, and serious antenna gain in the case of mobile phone technology), then I call bullshít.
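The distinction between total transmit power and effective (EIRP) power in that last point can be sketched in a few lines of Python. The 2W sector power is the example figure from the post above; the 18dB gain is purely illustrative, not a measured value:

```python
import math

# Total transmit power is fixed; antenna gain only concentrates it
# into a narrower beam. The figures below are illustrative examples.
tx_power_w = 2.0   # example sector power from the post
gain_db = 18.0     # illustrative antenna/beam-forming gain

gain_linear = 10 ** (gain_db / 10)   # ~63x concentration factor
eirp_w = tx_power_w * gain_linear    # effective power, in-beam only

print(round(eirp_w))  # 126 -- "looks like" 126 W, but only inside the beam

# Same figure in dBm, for comparison with spec sheets:
eirp_dbm = 10 * math.log10(eirp_w * 1000)
print(round(eirp_dbm, 1))  # 51.0 dBm
```

Note that the 126 W figure is an equivalent, not a real radiated total: the transmitter still only emits 2 W, and anywhere outside the narrow beam the power density is far lower.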


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


Only below 1GHz gives decent NLOS, to a lesser extent 2GHz; 3.5GHz is only slightly NLOS.


Even "high power" 3.5GHz and 5GHz gear is tiny compared with a TV transposer/relay, which can itself be a 1,000th or even a 10,000th the power of a main TV station.

As the frequency rises from 50MHz VHF to 864MHz UHF, NLOS performance decreases. Above 800MHz (0.8GHz) it degrades rapidly, so that by 10GHz it is nearly LOS-only. (Some path through trees is possible and some beam bending occurs on hilltops, but it's not usable.)

Mobile phones use 800/900/1800/1900MHz (0.8GHz to 1.9GHz), depending on technology, country etc. (hence dual, tri and quad band).

A mobile phone mast may output 10 to 250 times the power of a mobile... BUT!

Viewed from the side, the mast emits a cone-shaped pattern of radiation, so there is almost no power within 20m of the mast.

The inverse square law applies.
If a phone is putting power into your skull from 5cm away, then 50m is 1,000 times that distance, so the power density is 1,000,000 times lower, i.e. one millionth.

Multiply by even 1,000 for a REALLY high-power base station (though 100 is generous) and STILL the power from the base is ONE THOUSAND TIMES LESS than from the phone handset.
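That inverse-square arithmetic can be sanity-checked in a few lines of Python. The 5cm/50m distances and the x1000 mast factor are the illustrative figures from the post, not measurements:

```python
# Inverse-square comparison: handset at the ear vs base station at 50 m.
# All figures are the illustrative ones from the post, not measurements.

phone_distance_m = 0.05   # handset ~5 cm from the head
mast_distance_m = 50.0    # assumed distance to a mast

distance_ratio = mast_distance_m / phone_distance_m   # 1000x further away
power_density_ratio = distance_ratio ** 2             # inverse-square law

print(distance_ratio)        # 1000.0
print(power_density_ratio)   # 1000000.0 -- one millionth the density

# Even granting the mast 1000x the handset's transmit power,
# the handset still wins at your head by a factor of 1000:
mast_power_factor = 1000
net_advantage = power_density_ratio / mast_power_factor
print(net_advantage)         # 1000.0
```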


  • Closed Accounts Posts: 1,467 ✭✭✭bushy...


I can understand your worry about living near an IBB base, but I think you're trying to run before you can walk... http://boards.ie/vbulletin/showthread.php?t=2054892221
A better way to pass your afternoon, since you had the gear, would have been to measure near your head while making a call, then measure around outdoors; it might help you understand what watty was saying... turn off the calibrate bit first this time though (sorry, couldn't resist)


  • Closed Accounts Posts: 5 Nio


    oscarBravo wrote:
    Maybe - but experience seems to suggest that NLOS works a hell of a lot better at mobile frequencies than at 3.5GHz.
    watty wrote:
    Only below 1GHz gives decent NLOS, to a lesser extent 2GHz. 3.5GHz only slightly NLOS.
This is my point: 3.5GHz NLOS requires more energy in a given beam.
    watty wrote:
    Even "high power" 3.5GHz and 5GHz is tiny compared with TV Transposer / Relay which can be 1000th the power or 10,000th the power of a main TV station.
True, but it's not comparing like with like. Firstly, TV sites are less likely to be set within the kind of high-density population areas that mobile/FWA masts are, which greatly decreases the effective received power. Secondly, we can't assume that just because x power is OK at one frequency and modulation, the same power is OK at a different frequency. It's a big leap of faith to assume that all frequencies have similar (potential) biological interactions; clearly this is not the case, yet it is often put forward as if it were.
    oscarBravo wrote:
    Beam-forming doesn't increase signal strength. If a base station sector has a total output power of (say) 2W, then any technique to increase gain - whether it's beam-forming, physical antenna gain or anything else - can't increase the total output power past 2W.
Of course not; the point is that it focuses the power in a narrow beam, which increases the effective signal strength at the client end compared to, say, a sector antenna.
    oscarBravo wrote:
    And that's assuming that the beam-forming technology is remotely efficient.

The Ofcom 3.4GHz evaluation, including notes on WiMax-type beam-forming, gives a typical gain of 18dB:
http://www.ofcom.org.uk/research/technology/archive/cet/uwb/background_uwb_rpt/tech_evaluation
"WiMax TX channel EIRP = 59.5 dBm"

Sample mobile base station values from a UK Health Protection Agency report:
http://www.hpa.org.uk/radiation/publications/w_series_reports/2002/nrpb_w23.pdf
"The O2 antennas are quoted as operating at an EIRP of 51 dBm per signal for the GSM 900 band and 49 dBm for the GSM 1800 band"
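For anyone who finds dBm opaque, a quick Python sketch converts the two quoted EIRP figures to watts; the 59.5 dBm and 51 dBm values are the ones quoted above, not independent measurements:

```python
# Convert EIRP from dBm to watts: P(W) = 10 ** (dBm / 10) / 1000

def dbm_to_watts(dbm):
    return 10 ** (dbm / 10) / 1000

wimax_eirp_dbm = 59.5   # WiMax TX channel EIRP, as quoted from Ofcom
gsm900_eirp_dbm = 51.0  # O2 GSM 900 per-signal EIRP, as quoted from the HPA

wimax_w = dbm_to_watts(wimax_eirp_dbm)
gsm_w = dbm_to_watts(gsm900_eirp_dbm)

print(round(wimax_w), round(gsm_w))     # 891 126
print(round(wimax_w / gsm_w, 1))        # 7.1 -- every 3 dB roughly doubles power
```

So on those quoted per-channel figures the WiMax EIRP is about seven times the GSM 900 one, though neither number says anything about beam width or actual exposure at ground level.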
    oscarBravo wrote:
    "A lot more powerful" assumes that the sum of the base station transmit power plus antenna gain is greater than the same sum for a mobile phone base station. Unless you can demonstrate that this is the case (and we're talking about serious transmit power, and serious antenna gain in the case of mobile phone technology), then I call bullshít.
It has to be, to achieve NLOS coverage at that frequency. Please tell me why it isn't, given that we agree that at 3.5GHz more energy is dissipated?

    Bushy - very smart :)
It goes without saying that emissions from a mobile held against your head are much greater than any you're likely to encounter from a TX site, but there's a key difference: mobiles are only used intermittently, whereas the carrier signals from base stations are constantly on.


  • Closed Accounts Posts: 1,467 ✭✭✭bushy...


The only bit I was slagging about was the calibrate part... Anyway, what I meant was: go out into, say, your garden with your gear and get a field-strength reading there from a mobile mast.
Mobiles "chat" to the mast every so often to say "I'm alive" etc. too; also, the nearer you are to the base the less power they output.

