Small Network via BlackBerry Data Package and Tethering

Hello BBers !
I tether my BlackBerry data package to my laptop, of course using BlackBerry Desktop Software.
Now, I need to know if there is any way/device that can help me share this internet connection with other Wi-Fi devices such as an iPod Touch, tablet PCs, etc.
I tried using an ad-hoc network, but the iPod doesn't connect; even after the iPod finds the network and shows the tick, it still can't open any webpage or the App Store.
I use Windows 7, and I have a BlackBerry Bold 9780.
Mub

Currently, no BlackBerry device can act as a Wi-Fi hotspot (that is, share its data connection over Wi-Fi with other devices). There are rumors that this is coming in an OS 7 or BBX update, but those are just rumors.
I hope that helps you. 

Similar Messages

  • Problems with the O2 BlackBerry data package on my Curve 3G

    I have already informed O2 about this, but they claim that I should use the BlackBerry support services, and nothing there helps me!
    I got my BlackBerry Curve 3G on September 9th this year, and I put the BlackBerry Data Package bolt-on onto my phone on September 16th. I then received a text to say they'd taken £5 from my credit and it would be up and running within the next 24 hours. It's now September 19th, my BBM is not working at all, and I am extremely upset with the service and behaviour I have received from both O2 and BlackBerry.
    Is there any way you can help? If this fails, I shall be forced to go back to the shop where I got my BlackBerry and ask for their help.
    Many thanks, Jade.

  • Data package and data packet

    Hi,
    I want to know the difference between a data package and a data packet, and where each one comes up in SAP BW.
    with regards
    tushar

    Hello,
    The term "data package" is related to the DTP, which is used to load data from the PSA to further data targets.
    Start and end routines work at the package level, so the routine runs for each package, one by one (a sketch follows below). By default, a package contains data sorted by the non-unique keys (characteristics) of the source or target, and by setting semantic keys you can change this order. So a package with more data will take more time to process than a package with less data.
    The term "data packet" is related to the InfoPackage, which is used to load data from the source system into BI (the PSA).
    As per the SAP standard, around 50,000 records per data packet is preferred.
    A commit and save is performed for every data packet, so fewer data packets means less overhead.
    On the other hand, if you have 1 lakh (100,000) records in one data packet and there is an error in the last record, the entire packet fails.
    Hope it helps!
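
    To make the "package level" point concrete, below is a minimal sketch of a BW 3.x-style update start routine, the same form used in the longer routine in the "Issue in Update routine due to Data Package" thread further down. It is an illustration only, not runnable on its own: the routine receives just the current data package in DATA_PACKAGE, never the whole request.

    * Sketch only: a start routine sees ONE data package at a time.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      CHANGING ABORT LIKE SY-SUBRC.
      DATA: lv_lines TYPE i.
    * lv_lines is the number of records in THIS package only (e.g. up
    * to 50,000), never the total record count of the whole load.
      DESCRIBE TABLE DATA_PACKAGE LINES lv_lines.
      ABORT = 0.
    ENDFORM.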

  • Small network to centralise data in my suites

    What do those who know about these things think of the device (see link) that I am considering as a way of centralising/archiving data that I need at up to 4 workstations (never simultaneously), and never as part of the direct editing process? I.e., centralised storage of files for ease of access from 4 locations, and file sharing.
    I am considering loading it with two 1 TB drives formatted for Mac, to permit cross-platform workflows as required.
    http://www.netgear.com/Products/Storage/NetworkStorage/SC101T.aspx
    I have two active suites, and then there are two offices used by freelance artists or clients. All have access to the LAN (Ethernet).

    Rudin,
    Please refer to the following KnowledgeBase articles:
    -http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/5400ae61f71f1cc486256655005cb1c6?OpenDocument
    -http://pong.ni.com/public.nsf/websearch/FAB45BB4307AE063862565A70062ABA9?OpenDocument
    Regards,
    Cyril Bouton
    Applications Engineer
    National Instruments
    Cyril Bouton
    Active LabVIEW Developer

  • Data package/container size

    We are using MI, and I have a little question about data package sizes and container sizes.
    If I use data packaging with a very small package size, I read at help.sap.com that MI sends 20 containers.
    But how much data is sent by MI? What is the size of 1 container?
    I ask this question because we have a little problem with the sizes of the data packages/containers.
    So the major question is: what is the size of a container which is sent from MI to the middleware (SAP WAS)?
    How can I see the data I sent from MI to the middleware (SAP WAS), and how can I see whether the packages are full or not?
    I hope someone can help me. Thanks in advance.
    Kind regards,
    Bart Elshout

    Hi Thomas,
    Do you know anything about a "fragment bit" which is set when sending data from the SAP MI client to SAP WAS (the middleware)?
    The data is sent with the "don't fragment" bit set, and one of our switches/routers or something else in the network needs to fragment our data; because the bit is set it cannot, so the MI client receives an error message (Unexpected end of file from server).
    We want to set something in the SAP MI client which says this data packet may be fragmented, so it can pass through the network switches/routers/etc.
    Can someone help us with this problem?
    Thanks in advance and kind regards,
    Bart Elshout

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system where I pull data from a source using a Direct Access DTP. (It does not extract from the PSA; it extracts directly from the source.)
    The source is a table in the Oracle DB, and using a DataSource and a Direct Access DTP, I pull data from this table into my BW InfoCube.
    The DTP's package size has been set to 100,000, and whenever this load is triggered, a lot of data records from the source table are fetched in various data packages. This has been working fine, and works fine now as well.
    But, very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from the source.
    It ends with the message "No more data records found", even though we have records waiting to be pulled. This DTP in the process chain does not even fail; it continues to the next step with a "Green" status.
    Have you faced a similar situation in any of your systems? What is the cause? How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your Suggestions.
    Unfortunately, I would not be able to implement any of your suggestions, because I'm not allowed to change the DTP settings.
    So, I'm working on finding the root cause of this issue and came across SAP Note 1506944 - "Only one package is always extracted during direct access", which says this is a program error.
    Hence, I'm checking further with SAP on this and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Disk throughput drops when inserting data packages in write-optimized DSO

    Hi all,
    we are currently testing our freshly installed SAN.
    To see the performance gain in BI, I'm currently doing some test loads.
    And during the monitoring of those loads, I noticed something I'd like someone to explain :-):
    I execute a DTP from PSA to a write-optimized DSO.
    The n° of parallel processes = 9
    Update method = serial extraction, immediate parallel processing
    N° of records transferred: +23.000.000
    OK, in the first phase (reading the PSA) only one process is used (serial extraction). When I look in OS07, I notice we have very good throughput: +66.000 TransfKB/s. Very nice!
    But as soon as BI starts inserting the data packages, and parallel processing kicks in, the throughput drops to 4K or so, and sometimes we get 20K at most. That's not too good.
    We have a massive SAN, but the BI system does not seem to use it?
    I was wondering why this is the case. I already toyed around with the package size, but it's always the same.
    Also, I noticed that the allocated processes don't all seem to be active. I allocated 9 BTC processes to this load.
    They are all used, but we only see 3 inserts at the same time, max. In the DTP monitor, too, only 3 packages are processed at the same time. As it's a write-optimized DSO, I presume RSODSO_SETTINGS does not apply.
    Any ideas?
    Thanks!

    Hi,
    Can you please try to apply some filter in the DTP and try to pull the data?
    I am not sure why the first data package takes a long time while the other data packages take less time.
    Do you have any start routine with logic like "IF datapak = 1, then do this..." (see the sketch below)?
    Please check.
    regards
    Gopal
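
    For reference, the "IF datapak = 1" pattern mentioned above usually looks something like the following in a 3.x-style start routine. This is only a sketch with a made-up flag name (g_first_run): expensive one-time work is guarded so that it runs for the first package only, which is also a common reason why one package takes much longer than the others.

    * Global part of the routine: the flag survives across packages.
    DATA: g_first_run(1) TYPE c VALUE 'X'.

    FORM startup
      TABLES   DATA_PACKAGE STRUCTURE DATA_PACKAGE
      CHANGING ABORT LIKE SY-SUBRC.
      IF g_first_run = 'X'.
    *   One-time work (e.g. filling a global lookup table) goes here;
    *   with the guard it runs for the first package only.
        CLEAR g_first_run.
      ENDIF.
      ABORT = 0.
    ENDFORM.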

  • Small Network Trouble

    I am currently trying to set up a small network at my mother's work, and I have run into a small problem. Here is the situation: I have ten iMac G4s that I have networked together using a NetGear switch. I now need to get internet to these ten computers. The hard part is that the switch needs to get the internet from an AirPort Extreme in her office, and the switch is located down the hallway in another room. I don't have the option of running a single CAT5 cable from the base station to the switch; it needs to be done wirelessly. I will include a link to twitpic (http://twitpic.com/1x1xsz) where I have uploaded a diagram that I drew to give you a visualization of my problem.
    Any and all suggestions are welcome.
    Thanks

    Welcome to the discussions!
    You might want to explore all the options.
    You could use wireless and set up an AirPort Express to "join" the wireless network that the AirPort Extreme creates, then enable the ethernet port on the device to feed the switch.
    While this should work most of the time, you would be trusting an entire network to a single wireless connection. I would think long and hard about this and use this only if there were no other alternatives.
    You could also use a pair of ethernet powerline adapters to transmit the ethernet signal over the AC power line. Just run an ethernet cable from the AirPort Extreme to an adapter and plug it into the wall. Plug the other adapter into the wall and then run an ethernet cable from the adapter to the switch.
    Works great. Faster and more reliable than wireless by far. Newer powerline equipment is almost up to Gigabit speeds...almost as good as running an ethernet cable.
    Just make sure that everything is on the same master electrical circuit.
    Whichever you decide upon, it would also be a good idea to have a full understanding of the store's return policy in case things don't work out as hoped. Sometimes, it really is worth the extra expense to have the security of a traditional ethernet connection.

  • Issue in Update routine due to Data Package

    We have this peculiar situation.
    The scenario is as follows:
    We have to load data from ODS1 to ODS2.
    The data package size is 9980 while transferring data from ODS1 to ODS2.
    In the update rule we have some calculations, and we rank the records based on these calculations.
    The ODS key for both ODS1 and ODS2 is the same, i.e. Delivery Number, Delivery Item & Source System.
    For example, a Delivery Number has 12 Delivery Items.
    These Delivery Items end up in different data packages, namely Data Package 1 and Data Package 4.
    So instead of having the ranks as 1 to 10, it calculates them as 1 to 5 for the first group and 1 to 5 for the second.
    But what we require is a rank from 1 to 10.
    This is due to the fact that the items are in different data packages.
    The ABAP routine itself is working fine; the data package split is the problem.
    Can anybody suggest an alternative solution to this issue?
    Thanks in advance for your assistance.

    *** CODE FOR INTER-DATA-PACKAGE TREATMENT ***
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line *-*
    * TABLES: ...
    * DATA:   ...
    DATA: v_packet_nbr TYPE i VALUE 1.
    DATA:
      g_requnr  TYPE rsrequnr.
    DATA:
      l_is        TYPE string VALUE 'G_S_IS-RECNO',
      l_requnr    TYPE string VALUE 'G_S_MINFO-REQUNR'.
    FIELD-SYMBOLS: <g_f1> TYPE ANY,
                   <g_requnr> TYPE ANY.
    TYPES:
      BEGIN OF global_data_package.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: recno   LIKE sy-tabix,
      END OF global_data_package.
    DATA lt_data_package_collect TYPE STANDARD TABLE OF global_data_package.
    DATA ls_datapack TYPE global_data_package.
    * data package enhancement declaration
    TYPES: BEGIN OF datapak.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: END OF datapak.
    DATA: datapak1 TYPE STANDARD TABLE OF datapak,
          wa_datapak1 LIKE LINE OF datapak1.
    * declarations for the business-rules implementation
    TYPES: BEGIN OF ty_ydbsdppx.
            INCLUDE STRUCTURE /bic/aydbsdppx00.
    TYPES: END OF ty_ydbsdppx.
    DATA: it_ydbsdppx TYPE STANDARD TABLE OF ty_ydbsdppx WITH HEADER LINE,
          wa_ydbsdppx TYPE ty_ydbsdppx,
          temp TYPE /bic/aydbim00100-price,
          lv_tabix TYPE sy-tabix.
    *$*$ end of global - insert your declaration only before this line *-*
    * The following definition is new in BW 3.x:
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8YDBIM001.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR        "user-defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS "monitoring with record no.
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC.        "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line *-*
    * Fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries.
      TABLES: rsmonfact.
      TYPES:
        BEGIN OF ls_rsmonfact,
          dp_nr TYPE rsmonfact-dp_nr,
        END OF ls_rsmonfact.
      DATA: k TYPE i,
            v_lines_1 TYPE i,
            v_lines_2 TYPE i,
            v_packet_max TYPE i.
    * declaration of internal tables
      DATA: it_rsmonfact TYPE STANDARD TABLE OF ls_rsmonfact.
    ***************** INTER-PACKAGE COLLECTION TREATMENT *****************
      ASSIGN (l_requnr) TO <g_requnr>.
    * Determine the total number of data packages of this request.
      SELECT dp_nr FROM rsmonfact
        INTO TABLE it_rsmonfact
        WHERE rnr = <g_requnr>.
      DESCRIBE TABLE it_rsmonfact LINES v_packet_max.
      IF v_packet_nbr < v_packet_max.
    *   Not the last package yet: collect its rows, pass on an empty package.
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        CLEAR: DATA_PACKAGE.
        REFRESH DATA_PACKAGE.
        v_packet_nbr = v_packet_nbr + 1.
        CLEAR: MONITOR[], MONITOR.
        MONITOR-msgid = '00'.
        MONITOR-msgty = 'I'.
        MONITOR-msgno = '398'.
        MONITOR-msgv1 = 'All data_packages have been gathered in one. '.
        MONITOR-msgv2 = 'The last DATA_PACKAGE contains all records.'.
        APPEND MONITOR.
      ELSE.
    *   Last data package => perform the business rules.
        IF v_packet_max > 1.
          APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
          CLEAR: DATA_PACKAGE[], DATA_PACKAGE.
          k = 1.
    *     Put everything collected back into DATA_PACKAGE, renumbering recno.
          LOOP AT lt_data_package_collect INTO ls_datapack.
            ls_datapack-recno = k.
            APPEND ls_datapack TO DATA_PACKAGE.
            k = k + 1.
          ENDLOOP.
          CLEAR: lt_data_package_collect.
          REFRESH: lt_data_package_collect.
        ENDIF.
    *   Sort the global data package and keep only the first occurrence
    *   of each record.
        SORT DATA_PACKAGE BY material plant calmonth.
        DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
              COMPARING material plant calyear.
        SELECT * FROM /bic/aydbsdppx00
            INTO TABLE it_ydbsdppx
            FOR ALL ENTRIES IN DATA_PACKAGE
              WHERE material = DATA_PACKAGE-material
                AND plant    = DATA_PACKAGE-plant
                AND calyear  = DATA_PACKAGE-calyear.
    *   Enhance DATA_PACKAGE with the target's additional fields.
        LOOP AT DATA_PACKAGE.
          CLEAR: wa_datapak1, wa_ydbsdppx.
          MOVE-CORRESPONDING DATA_PACKAGE TO wa_datapak1.
          READ TABLE it_ydbsdppx INTO wa_ydbsdppx
            WITH KEY material = DATA_PACKAGE-material
                        plant = DATA_PACKAGE-plant
                      calyear = DATA_PACKAGE-calyear.
          IF sy-subrc NE 0.       "new product price
            APPEND wa_datapak1 TO datapak1.
          ELSE.                   "a product price already exists
    *       Keep the oldest one (for each year), or overwrite the price
    *       if it is the same month.
            IF wa_ydbsdppx-calmonth GE DATA_PACKAGE-calmonth.
              APPEND wa_datapak1 TO datapak1.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    * If ABORT is not equal to zero, the update process will be cancelled.
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line *-*
    ENDFORM.

  • Every 3rd data package taking long time for execution

    Hi Everyone
    We are facing a strange situation. Our scenario involves doing a full load from a DSO to a cube.
    The start routines are not very database-intensive, and care has been taken to write them in an optimized way.
    But strangely, every 3rd data package takes exceptionally longer than the other data packages.
    a) The DTP has 3 parallel processes.
    b) The time spent in extraction, rules, and update is constant for every data package.
    c) The start routine time is larger for every 3rd data package and keeps increasing, e.g. 5 mins, 10 mins, 24 mins, 33 mins; it increases with each 3rd package.
    I tried to analyze the data that was taking so much time but found no difference between the data in the normal and the long-running packages (i.e. there was no logical difference in the data that would make the start routine behave like this).
    I was wondering what the possible reasons for this could be; maybe some external system factors are responsible. If someone can help in this regard, it will be highly appreciated.

    Hi Hemanth,
    In your start routine, are you by any chance adding or multiplying records in the source package? Something like: copy the source package into an internal table, add records to the internal table, and then copy it back to the source package? If logic of this sort is in your start routine, you need to refresh the internal table; otherwise its records go on increasing with every data package, so the processing time increases as the load progresses. This is one common mistake I have seen. Please check your code for something like that and refresh the internal tables, and see if it makes any difference. A sketch of the problem follows below.
    Thanks and Regards
    Subray Hegde
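
    As a sketch of the mistake described above (gt_buffer and the generated type name are placeholders for whatever global table the routine actually fills from the package):

    * In the global part of the start routine (hypothetical):
    * DATA gt_buffer TYPE _ty_t_sc_1.   "same table type as SOURCE_PACKAGE
    * Inside the start routine itself:
      REFRESH gt_buffer.               "the fix: clear before refilling;
    * without this REFRESH, gt_buffer still holds all previous packages'
    * records, so each successive package processes more and more rows.
      APPEND LINES OF source_package TO gt_buffer.
    * ... heavy per-package logic over gt_buffer follows here ...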

  • Need URGENT help on DATA PACKAGE

    Hi team,
    When we extract data from the data source, it arrives package by package; before the start of processing, I need to add all the packages into one internal table.
    How do I do that?
    Please help me, it's urgent. I appreciate your response; thanks.
    Regards,
    Senthil

    hi,
    Do you need to do the processing in the source system or in BW?
    If it is in the source system, extract through a program/function module that collects all the data packages, do the processing there, and push the data to BW once the process is over.
    If it has to be done in BW, then in the start routine you can, by default, only work package by package; collecting the packages into a global table is the usual workaround (see the sketch below).
    Please give your actual requirement so that others can give suggestions.
    Ramesh
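
    A condensed sketch of the collect-all-packages pattern, using the same variable names as the full routine in the "Issue in Update routine due to Data Package" thread above (they must be declared in the routine's global part, as shown there):

      IF v_packet_nbr < v_packet_max.
    *   Not the last package: stash its rows and pass an empty package on.
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        REFRESH DATA_PACKAGE.
        v_packet_nbr = v_packet_nbr + 1.
      ELSE.
    *   Last package: put all collected rows back, process them in one go.
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        REFRESH DATA_PACKAGE.
        APPEND LINES OF lt_data_package_collect[] TO DATA_PACKAGE[].
        REFRESH lt_data_package_collect.
      ENDIF.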

  • Small network

    Probably not the best place for this post, but probably where I'll find the best people to answer my question.
    The problem: I wish to set up a very small network to use a printer and an external drive with a wireless MacBook, without spending too much money.
    The MacBook (obviously) roams the house on a wireless connection. The D-Link router sits handily in the cupboard under the stairs. I would like to place my printer and external drive under the stairs with the router, and for them to be wirelessly accessible from the MacBook.
    The solution that comes to mind is to buy a Mac mini for under the stairs and connect everything through this. But that seems a terrible waste of a mini (and, more importantly, money!).
    Any better ideas of how to achieve this would be very much appreciated.
    With thanks,
    David.

    The mini will work, and in fact has one advantage over the AirPort Extreme Base Station, but I think the AEBS is the better solution overall. The AEBS can share USB hard drives, and with a USB hub it can share several. My wife and I each have our own for backing up, and a third for our music and video collection. The AEBS can also share network printers and some USB printers. The last time I looked, the list of compatible USB printers hadn't been updated by Apple for quite some time, but network printers have really dropped in price. This is the sole advantage of a mini: you could hang any printer you wanted off of it.

  • Small Network Issue

    I have a small network with a Time Capsule (TC) and two AirPort Express (AX) base stations.
    Everything seems to be working fine, but it drives me crazy to see the status light of the AXs blinking constantly.
    In AirPort Utility all the base stations show a green status light, but on the actual AX bases the status light is blinking yellow; wireless security is WPA2.
    In other words: what can I do to get the AX light solid green?
    Thank you

  • Splitting data packages in CMOD

    Hello All,
    We are currently using a standard-delivered SAP extractor; however, we have written additional logic in CMOD for this extractor. As a result of the CMOD enhancement, one record can potentially expand into hundreds of records, which increases the size of the initial data package dramatically. Is there a way in CMOD to split the final data package into multiple data packages and send them to BW one at a time?
    Thanks,
    BQ

    If you change the data after the extraction has taken place, then you can't: the exit only modifies the current package in place (see the sketch below).
    Maybe you can include your logic in the extractor itself, so that all the information is selected during the extraction process.
    Regards
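
    For context, the customer-exit enhancement runs once per data package inside the extractor: in include ZXRSAU01 (function EXIT_SAPLRSAP_001) you receive the current package in C_T_DATA and can only modify it in place. A rough sketch (the DataSource name is hypothetical):

    * Include ZXRSAU01 - called once for each data package.
      CASE i_datasource.
        WHEN 'ZSD_MY_DATASOURCE'.          "hypothetical DataSource
    *     Rows appended to c_t_data here enlarge the CURRENT package;
    *     the exit cannot split it into several smaller packages. The
    *     package size has to be controlled before extraction, e.g. via
    *     the InfoPackage's data packet size settings.
      ENDCASE.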

  • Reason for Error in Manual Update of Data Package

    Hey,
    Sometimes while loading a huge amount of data, one of the data packages doesn't get uploaded. For those data packages we do a manual update, and it works well.
    My question is: why do we get this error?
    Does it have to do with system contention or an IDoc issue?
    Please help me find the root cause of this, so that we can avoid the manual update of the data packages.

    We usually get this issue when we are doing a lot of processing in the update rules; the data package processing gets timed out.
    If it happens very often, you will need to sit with your Basis person and modify the timeout settings for your system.
