Connected, but no data entry in table

Everything works fine on my live site, but using my local testing server I am unable to get data into my MySQL table from a simple form. The configuration and code have been changed in both locations, and there is a connection to localhost. Are there Dreamweaver or PHP settings that might be causing this? Thanks for your help.

Have you installed MySQL and phpMyAdmin on your system? Have you created the database on your local server? Can you edit the tables in your database using the MySQL control panel? If the answer to these questions is yes, then perhaps a sample of your code might give an idea of what is happening.
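One quick way to narrow it down is to run the same insert by hand against the local server (in phpMyAdmin or the mysql command-line client): if MySQL rejects it there, the problem is in the database or its privileges rather than in Dreamweaver or PHP. The database, table and column names below are only placeholders for whatever your form actually uses.

-- Confirm the local database and table exist
SHOW DATABASES;
USE mytestdb;              -- placeholder database name
SHOW TABLES;
DESCRIBE contact_form;     -- placeholder table name

-- Try the insert the form would perform; an error here points at the
-- table definition or user privileges rather than the PHP/Dreamweaver side
INSERT INTO contact_form (name, email) VALUES ('test', 'test@example.com');
SELECT * FROM contact_form;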

Similar Messages

  • Add fields to CAT2 data entry area table control

    Hi All,
    I have a requirement wherein I need to add a column after every day (Monday, Tuesday, ...) in CAT2 to enable the user to enter a short text description for the time entry accountability. For this I will have to add 7 additional columns to the table control of the data entry area, one beside each day. The description entered by the user should be updated in the database.
    I am thinking of using screen exit CATS0012, but I am not very sure how to go about it.
    Please help.
    Thanks in advance.
    Regards,
    Joshi

    Thanks Ankush. I am going for the CATS0005 option. I have created a customer project TSHEET within an existing PS package on the system. The system prompted me to obtain a transport number, which I guess will attach the new project to the package in future.
    Could you advise further on how to do this in practice? My instructions say 'on the initial project administration screen select the Enhancement components field and choose Change'. In fact there appear to be 2 options - either Components or Enhancement assignment. If I choose Enhancement assignment I am prompted to enter the name of an exit. I entered CATS0005, as I have to bring this into the customer project somehow. Within the package, MY_CATS_INTERFACE appeared; however, CATS0005 was not recognised within this package. Likewise CI_CATSDB was not recognised.
    Any advice would be useful.
    Thanks
    Rachel

  • Rp-pppoe connected but no data transfer

    Hi,
    I installed Arch on the 27th. Post installation I set up the internet connection using rp-pppoe, which was a straightforward setup. I also got connected to the net and downloaded data during a system upgrade. Except for this one instance, for the last two days whenever I connect I get no data transfer after the connection is established.
    Upon pinging 8.8.8.8 I get a "destination net unreachable" error, and if I ping google.com I get an "unknown host" error.
    When I checked the pppoe status I get this message:
    link is up and running on interface ppp0 and no packet errors both RX and TX.
    The system upgrade did not happen on the 27th due to the tzselect conflicting-files error.
    It is a fresh system.
    I am pasting below the lines from messages.log of the 27th, when the data transfer happened for the first time, and then of the 29th with the same system, where nothing happens except that I am connected.
    27th:
    Jun 27 12:46:03 localhost kernel: [  117.794497] r8169 0000:02:00.0: eth0: link up
    Jun 27 12:46:06 localhost kernel: [  120.763326] eth0: no IPv6 routers present
    Jun 27 12:58:01 localhost kernel: [  835.641169] PPP generic driver version 2.4.2
    Jun 27 12:58:01 localhost pppd[888]: pppd 2.4.5 started by root, uid 0
    Jun 27 12:58:01 localhost pppd[888]: Using interface ppp0
    Jun 27 12:58:01 localhost pppd[888]: Connect: ppp0 <--> /dev/pts/0
    Jun 27 12:58:02 localhost pppoe[889]: PADS: Service-Name: ''
    Jun 27 12:58:02 localhost pppoe[889]: PPP session is 14295 (0x37d7)
    Jun 27 12:58:02 localhost pppd[888]: CHAP authentication succeeded: CHAP authentication success, unit 7436
    Jun 27 12:58:02 localhost pppd[888]: CHAP authentication succeeded
    Jun 27 12:58:02 localhost kernel: [  837.068606] PPP BSD Compression module registered
    Jun 27 12:58:03 localhost pppd[888]: Cannot determine ethernet address for proxy ARP
    Jun 27 12:58:03 localhost pppd[888]: local  IP address 117.204.173.123
    Jun 27 12:58:03 localhost pppd[888]: remote IP address 117.204.160.1
    Jun 27 12:58:03 localhost pppd[888]: primary   DNS address 218.248.255.212
    Jun 27 12:58:03 localhost pppd[888]: secondary DNS address 218.248.241.2
    Jun 27 13:01:01 localhost /USR/SBIN/CROND[934]: (root) CMD (run-parts /etc/cron.hourly)
    Jun 27 13:01:01 localhost anacron[938]: Anacron started on 2012-06-27
    Jun 27 13:01:01 localhost anacron[938]: Will run job `cron.daily' in 28 min.
    Jun 27 13:01:01 localhost anacron[938]: Will run job `cron.weekly' in 48 min.
    Jun 27 13:01:01 localhost anacron[938]: Will run job `cron.monthly' in 68 min.
    Jun 27 13:01:01 localhost anacron[938]: Jobs will be executed sequentially
    Jun 27 13:04:10 localhost -- MARK --
    Jun 27 13:16:48 localhost pppoe-stop: Killing pppd
    Jun 27 13:16:48 localhost pppd[888]: Terminating on signal 15
    Jun 27 13:16:48 localhost pppd[888]: Connect time 18.8 minutes.
    Jun 27 13:16:48 localhost pppd[888]: Sent 411268 bytes, received 16885685 bytes.
    29th :
    Jun 29 17:54:13 localhost kernel: [    5.019108] r8169 0000:02:00.0: eth0: link down
    Jun 29 17:54:14 localhost dhcpcd[675]: eth0: carrier acquired
    Jun 29 17:54:14 localhost kernel: [    6.632983] r8169 0000:02:00.0: eth0: link up
    Jun 29 17:54:14 localhost dhcpcd[675]: eth0: broadcasting for a lease
    Jun 29 17:54:15 localhost dhcpcd[675]: eth0: offered 192.168.1.2 from 192.168.1.1
    Jun 29 17:54:15 localhost dhcpcd[675]: eth0: acknowledged 192.168.1.2 from 192.168.1.1
    Jun 29 17:54:15 localhost dhcpcd[675]: eth0: checking for 192.168.1.2
    Jun 29 17:54:21 localhost dhcpcd[675]: eth0: leased 192.168.1.2 for 10080 seconds
    Jun 29 17:54:21 localhost dhcpcd[675]: forked to background, child pid 702
    Jun 29 17:54:21 localhost /usr/sbin/crond[751]: (CRON) STARTUP (1.4.8)
    Jun 29 17:54:21 localhost /usr/sbin/crond[751]: (CRON) INFO (Syslog will be used instead of sendmail.): No such file or directory
    Jun 29 17:54:21 localhost /usr/sbin/crond[751]: (CRON) INFO (running with inotify support)
    Jun 29 17:54:47 localhost kernel: [   39.424170] NET: Registered protocol family 10
    Jun 29 17:54:47 localhost kernel: [   39.441086] NET: Registered protocol family 4
    Jun 29 17:54:47 localhost kernel: [   39.458946] NET: Registered protocol family 5
    Jun 29 17:54:47 localhost kernel: [   39.480053] PPP generic driver version 2.4.2
    Jun 29 17:54:47 localhost pppd[811]: pppd 2.4.5 started by root, uid 0
    Jun 29 17:54:47 localhost pppd[811]: Using interface ppp0
    Jun 29 17:54:47 localhost pppd[811]: Connect: ppp0 <--> /dev/pts/0
    Jun 29 17:54:48 localhost pppoe[812]: PADS: Service-Name: ''
    Jun 29 17:54:48 localhost pppoe[812]: PPP session is 2021 (0x7e5)
    Jun 29 17:54:48 localhost pppd[811]: CHAP authentication succeeded: CHAP authentication success, unit 4257
    Jun 29 17:54:48 localhost pppd[811]: CHAP authentication succeeded
    Jun 29 17:54:48 localhost kernel: [   40.866233] PPP BSD Compression module registered
    Jun 29 17:54:48 localhost pppd[811]: not replacing existing default route via 192.168.1.1
    Jun 29 17:54:48 localhost pppd[811]: Cannot determine ethernet address for proxy ARP
    Jun 29 17:54:48 localhost pppd[811]: local  IP address 117.204.161.173
    Jun 29 17:54:48 localhost pppd[811]: remote IP address 117.204.160.1
    Jun 29 17:54:48 localhost pppd[811]: primary   DNS address 218.248.255.212
    Jun 29 17:54:48 localhost pppd[811]: secondary DNS address 218.248.241.2
    Jun 29 17:54:57 localhost kernel: [   49.523342] eth0: no IPv6 routers present
    Jun 29 18:01:01 localhost /USR/SBIN/CROND[876]: (root) CMD (run-parts /etc/cron.hourly)
    Jun 29 18:01:01 localhost anacron[881]: Anacron started on 2012-06-29
    Jun 29 18:01:01 localhost anacron[881]: Will run job `cron.daily' in 29 min.
    Jun 29 18:01:01 localhost anacron[881]: Will run job `cron.weekly' in 49 min.
    Jun 29 18:01:01 localhost anacron[881]: Will run job `cron.monthly' in 69 min.
    Jun 29 18:01:01 localhost anacron[881]: Jobs will be executed sequentially
    Jun 29 18:04:04 localhost pppoe-stop: Killing pppd
    Jun 29 18:04:04 localhost pppd[811]: Terminating on signal 15
    Jun 29 18:04:04 localhost pppd[811]: Connect time 9.3 minutes.
    Jun 29 18:04:04 localhost pppd[811]: Sent 0 bytes, received 926 bytes.
    Now in both cases I got connected, but as I said before, except twice on the 27th I am not able to get data transfer of any type, either through ping or a pacman upgrade.
    At the same time I cross-checked on my other systems, Ubuntu and Windows, and had no problems with them, to rule out any issue with my ISP. As of now I am typing from one of those systems.
    I have googled a lot and read the other options of pppd etc., but in similar queries by other users I found no viable solution to this problem.
    My hostname is also correct in the config files. I also once used a static IP for a change to see if things would start working,
    but nothing has been successful so far.
    The only viable solution I have read so far in a Google search where there was success was a fresh installation of Arch by a user, and he got it working. But I definitely won't go with that. As I said above, I got connected twice on the day of the installation, and after that I am still getting connected but with no data transfer.
    Please suggest what to do to get the connection working.
    Regards

    Upon a search on the forums yesterday
    I read one user query on these forums where the user was advised the same; he had the same problem as me and it did not get solved with either method.
    As of his last post his query was still open. I did not save that link.
    So I have not given that method a try so far, waiting for a solution to this method without going for new developments, thinking a search for a day or two might get some viable results.
    I guess you are mentioning the topic of this link:
    https://wiki.archlinux.org/index.php/Pppd [This article explains how to set up a point-to-point connection using pppd and the kernel PPPoE driver.]
    Further, in my case the connection is happening and is also getting disconnected. You can see the same in the log file which I pasted.
    Jun 27 13:16:48 localhost pppd[888]: Connect time 18.8 minutes.
    Jun 27 13:16:48 localhost pppd[888]: Sent 411268 bytes, received 16885685 bytes.
    Jun 29 18:04:04 localhost pppd[811]: Connect time 9.3 minutes.
    Jun 29 18:04:04 localhost pppd[811]: Sent 0 bytes, received 926 bytes.
    So the same thing gives different results on 2 different days. I don't understand what the problem really is. In fact the system is not giving out what the problem is either: a connection happens, no data transfer takes place, the connection is cut and the stats are shown. Just diving into implementing some new method, or a reinstall, without resolving this looks sensible yet senseless too, and simply does not appeal here. If there really is some bug or issue with that rp-pppoe package then there is some point in jumping to another method.
    This must be solvable some way or other.
    Last edited by beopen (2012-06-29 17:30:16)

  • Internet sharing: connected but no data

    Using MB Air 10.6.8 as host to share a USB Mobile Modem Internet connection.
    Connected a MB 10.7 as client via Airport to use the shared, working USB connection of the host.
    When running network diagnostics on the client every light is "green", but no data is transferred from the web; the Safari screens remain empty.
    It's not the first time that I have set up Internet sharing, but it is the first time it has been unsuccessful.
    Any hint is appreciated

    For what it's worth, I'm having the exact same problem. I have been using the same setup for almost two years, a Mini with Internet Sharing turned on, sharing over Wi-Fi, but suddenly this morning it just stopped working. Full bars on all devices (iPads, iPhones, etc.), but no Internet connection. I have tried restarting, resetting, etc., but nothing works. Not a happy camper right now.

  • Simple tutorial for using ALV grid for data entry into a table, please!

    Hi friends,
    I urgently need a basic, simple tutorial, step-by-step guide or sample code on the following:
    I want an ALV-grid-like entry list where I can add/remove lines/entries that are then saved into an internal table. Please help me with that, as I have already studied some documents but do not really get the idea of how to do it - <REMOVED BY MODERATOR>
    Thanks in advance,
    Edited by: Alvaro Tejada Galindo on Jan 11, 2008 6:18 PM

    Hi Clemens,
    Follow these links; they may be useful to you:
    http://www.sap-basis-abap.com/sapab033.htm
    http://www.abapprogramming.blogspot.com/2007/04/alv-details.html
    For a tutorial on ALV:
    http://www.sapbrainsonline.com/TUTORIALS/TECHNICAL/ALV_tutorial.html
    I have PDF material also; I can give it to you if you give me your email ID.
    Hope this helps you.
    Regards,
    Sravanthi

  • ASE 15.7: how to find a data entry using a table page number

    Hi,
    I am looking for a dbcc command to get the data when I know the page number of a table.
    Thank you

    Hi Isabella,
    What you're asking is not supported by SAP, although you can achieve the result by using dbcc page:
    http://wiki.scn.sap.com/wiki/display/SYBASE/DBCC+page
    The only problem is that the data is in binary format, so you have to format it yourself into a readable format.
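    Not from the wiki page itself, just an illustrative sketch of that kind of session; the database id and page number here are made up, and the exact dbcc page argument list varies between ASE versions:
    -- Route dbcc output to the client session instead of the errorlog
    dbcc traceon(3604)
    go
    -- Dump page 12345 of database id 4; the third argument controls how much detail is printed
    dbcc page(4, 12345, 2)
    go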
    Regards,
    Adam

  • My App Store is saying I have to change the date and time settings to get a secure connection, but the date and time are correct. Please help?

    My App Store is telling me I have to change the date and time settings to download apps, and I don't know what is wrong because the date and time are correct.

    Is the Time Zone also correct?
    Also see:
    Can't connect to the iTunes Store

  • Merge Two Tables with the same columns but different data

    I have a table that has the following columns:
    Current Table Definition
    commonname
    family
    genus
    species
    subspecies
    code
    I have a number of entries that don’t fit the current table definition – that is, they only have a common name or description and a code. These records don’t actually represent a species, but are needed for data entry because they represent an object that may be encountered in the study (Bare Ground – which isn’t a species but would need to be recorded if encountered). So I would really like 2 tables:
    Table 1 Miscellaneous
    name
    code
    Table 2 Plant Species
    commonname
    family
    genus
    species
    subspecies
    code
    I would like two tables so I can enforce certain constraints on my species table, like requiring that the family, genus, species, subspecies combination is unique. I can’t do this if I have all the “other” records that don’t have a family, genus, species, or subspecies, unless I put a lot of dummy data into the fields to make each record unique. I don’t really want to do this because these miscellaneous records don’t represent a specific species.
    So – the problem is that while I want this data separate, I will need to point a column from another table to the code column in both tables.
    How is this best done? Table? View? Merge?

    Hi,
    Actually you don't have to use scope refs. Sorry but I misunderstood you earlier. Here is a complete example that does exactly what you want. Notice how I added the constraint to the materialized view. Also notice when we try to insert a code in tbl3 that doesn't exist in the view, we get an error. HTH.
    SQL> create table tbl1 (name varchar2(10), code varchar2(3) primary key);
    Table created.
    SQL> create table tbl2 (commonname varchar2(10), code varchar2(3) primary key);
    Table created.
    SQL> insert into tbl1 values ('n1','c1');
    1 row created.
    SQL> insert into tbl1 values ('n2','c2');
    1 row created.
    SQL> insert into tbl1 values ('n3','c3');
    1 row created.
    SQL> insert into tbl2 values ('name1','c1');
    1 row created.
    SQL> insert into tbl2 values ('name2','c2');
    1 row created.
    SQL> insert into tbl2 values ('name3','c3');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> create materialized view view1 as select name, commonname, tbl1.code from tbl1, tbl2 where tbl1.code = tbl2.code;
    Materialized view created.
    SQL> select * from view1;
    NAME COMMONNAME COD
    n1 name1 c1
    n2 name2 c2
    n3 name3 c3
    SQL> create table tbl3 (code varchar2(3), record varchar2(1));
    Table created.
    SQL> alter table view1 add constraint view1pk primary key (code); -- <-Note how I added a constraint to the view
    Table altered.
    SQL> alter table tbl3 add constraint tbl3fk foreign key (code) references view1(code);
    Table altered.
    SQL> insert into tbl3 values ('c1','r');
    1 row created.
    SQL> insert into tbl3 values ('c99','r');
    insert into tbl3 values ('c99','r')
    ERROR at line 1:
    ORA-02291: integrity constraint (RAJS.TBL3FK) violated - parent key not found
    SQL> spool off;
    -Raj Suchak
    [email protected]

  • How to copy data of a table to the same table, but on a DIFFERENT MACHINE?

    Hello experts,
    I want to copy all entries of table TE422 to the same table, but on a different machine. I created a transport request (a workbench request using TABU) intending to copy the entire table data from QA into the DEV environment.
    However, this is not working because, in our installation, QA is on one server and DEV is on another! So, in a way, each machine seems to be transparent to the other, and I cannot, or do not know how to, specify the DEV environment as the target.
    I was told that I have to do something at the OS level, that is to say, have some kind of SQL query which collects data from one and uploads it onto the other.
    Now my question is: is there an SAP way to do this, that is to say, forget the SQL thingy and do it with some SAP transaction? If yes, how? If not, do you have sample code which does this, please?
    Your help is greatly appreciated.
    Goharjou

    Method 1:
    You can transport using ALE/IDoc: write a simple outbound function module in the source system and develop an inbound function module in the target system.
    Method 2:
    Download the entire table contents into a flat file from the source system by developing a simple report using the WS_DOWNLOAD function module. Then develop a report using WS_UPLOAD in the target system and upload the flat file.
    Thanks & Regards,
    Vijay

  • Using an MS Excel like template for data entry into an Oracle table

    Does anybody know of a way, or a development tool, that has the look and feel of MS Excel but can be used for data entry, saved to an Oracle table, and embedded into a JSP page? I am currently using iSupport and have a need to allow users to enter data into an MS Excel-like template; once the user saves the service request I need the values to be saved in the attribute fields of the service request. I need the ability to map the cells of the template to the fields in the Oracle table. Any help would be greatly appreciated. Thanks in advance.

    An Excel spreadsheet may be stored in a database table:
    http://www.oracle.com/technology/pub/articles/saternos_tables.html
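    Not from the article above, just an illustrative sketch under the assumption of a cell-per-row layout: one way to map the cells of such a template to an Oracle table is a staging table keyed by the service request, with one row per cell. All names here are hypothetical.
    -- Hypothetical staging table: one row per template cell, keyed by the service request
    CREATE TABLE sr_template_cells (
        request_id   NUMBER         NOT NULL,  -- service request the entry belongs to
        sheet_row    NUMBER         NOT NULL,  -- row position in the Excel-like grid
        sheet_col    NUMBER         NOT NULL,  -- column position in the grid
        attr_name    VARCHAR2(60),             -- attribute field the cell maps to
        cell_value   VARCHAR2(400),            -- raw value entered by the user
        CONSTRAINT sr_template_cells_pk PRIMARY KEY (request_id, sheet_row, sheet_col)
    );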

  • How to set order of data entry in a table?

    A simple question, I know, but I cannot seem to find out how to do it...
    If I want to enter data in a two-column table, how do I force Numbers to move the data entry box in a specified way on "enter"? I want to enter data in a1, b1, a2, b2, a3, b3 etc. and want the correct cell to be active after hitting enter. In Excel, you just highlight the area you want to fill. This doesn't seem to work in Numbers. Any ideas?
    TIA
    nicco

    I rechecked that; yes, I must have been using Enter.
    But they are still 2 keys on opposite sides of the keyboard.

  • Data entry in the table

    Hi All,
    I have a scenario...
    I have one Z table with the following fields:
    1. enum
    2. Ecnum
    3. Add
    4. Date
    5. erole
    What I did is I made the enum and ecnum fields the key fields in the table, and the rest I did not make key fields. Now when I try to enter data in this table using SM30 I am facing a problem. I tried putting in the following records:
    1st rec.
    123
    012
    xyz
    010101
    1
    2nd record.
    321
    011
    abc
    020202
    1
    Now as soon as I enter the second record it says the record is already there,
    so can you tell me what the problem is? I mean, I didn't make erole a key field, but I am still getting this error...
    Thanks,
    Rajeev
    Edited by: Rajeev  Gupta on Oct 3, 2008 4:09 PM

    Hi Rob,
    I did not mean that you should always define one key field in the table.
    I was referring to the scenario he wants to create,
    i.e. the key field should not be repeated for another entry even if there are 2 key fields in the table.
    It is possible to achieve this with table maintenance modification events, but I guess the best way to restrict this would be to define only one key field for the table.
    regards,
    Advait

  • Extracting data from multiple tables using DB connect

    Hi,
    I have different tables in an Oracle database which have the same structure, but their names are different. Now I have only one DataSource on the BI side. This DataSource should extract data from the tables dynamically. How can I do it using DB Connect?
    Thanks

    Ah, I see - the problem, as you said, is then if you take on a new location!
    I would then put a table identifier into the source system and create a view across all the tables.
    Then DB Connect from the view and use a selection parameter on the table identifier if you wanted one InfoPackage per "location".
    If you do need a new table in the source, then just expand the view and create a new InfoPackage.
    Hence NO BW changes are required that need a dev-q-p transport - just the InfoPackage in prod, and it's the source system's problem to add the extra table to the view.
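    Not from the thread, just a sketch of the kind of view described above; the table and column names are hypothetical:
    -- One view across the per-location tables, adding a literal identifier column
    -- so the single DataSource can select on it (one InfoPackage per "location")
    CREATE OR REPLACE VIEW v_sales_all_locations AS
      SELECT 'LOC_A' AS location_id, t.* FROM sales_loc_a t
      UNION ALL
      SELECT 'LOC_B' AS location_id, t.* FROM sales_loc_b t;
    -- Taking on a new location later only means extending the view:
    --   UNION ALL SELECT 'LOC_C' AS location_id, t.* FROM sales_loc_c t;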

  • Can you check for data in one table or another but not both in one query?

    I have a situation where I need to link two tables together, but the data may be in another (archive) table, or different records are in both, and I want the latest record from either table:
    ACCOUNT
    AccountID     Name   
    123               John Doe
    124               Jane Donaldson           
    125               Harold Douglas    
    MARKETER_ACCOUNT
    Key     AccountID     Marketer    StartDate     EndDate
    1001     123               10526          8/3/2008     9/27/2009
    1017     123               10987          9/28/2009     12/31/4712    (high date ~ which means currently with this marketer)
    1023     124               10541          12/03/2010     12/31/4712
    ARCHIVE
    Key     AccountID     Marketer    StartDate     EndDate
    1015     124               10526          8/3/2008     12/02/2010
    1033     125               10987         01/01/2011     01/31/2012  
    So my query needs to return the following:
    123     John Doe                        10526     8/3/2008     9/27/2009
    124     Jane Donaldson             10541     12/03/2010     12/31/4712     (this is the later of the two records for this account between archive and marketer_account tables)
    125     Harold Douglas               10987          01/01/2011     01/31/2012     (he is only in archive, so get this record)
    I'm unsure how to proceed in one query.  Note that I am reading in possibly multiple accounts at a time and returning a collection back to .net
    open CURSOR_ACCT
              select AccountID
              from
                     ACCOUNT A,
                     MARKETER_ACCOUNT M,
                     ARCHIVE R
               where A.AccountID = nvl((select max(M.EndDate) from Marketer_account M2
                                                    where M2.AccountID = A.AccountID),
                                                      (select max(R.EndDate) from Archive R2
                                                    where R2.AccountID = A.AccountID)
                   and upper(A.Name) like parameter || '%'
    <can you do a NVL like this?   probably not...   I want to be able to get the MAX record for that account off the MarketerACcount table OR the max record for that account off the Archive table, but not both>
    (parameter could be "DO", so I return all names starting with DO...)

    If I understand your description, I would assume that for John Doe we would expect the second row from marketer_account ("high date ~ which means currently with this marketer"). Here is a solution with analytic functions:
    drop table account;
    drop table marketer_account;
    drop table marketer_account_archive;
    create table account (
        id number
      , name varchar2(20)
    );
    insert into account values (123, 'John Doe');
    insert into account values (124, 'Jane Donaldson');
    insert into account values (125, 'Harold Douglas');
    create table marketer_account (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account values (1001, 123, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('27.09.2009', 'dd.mm.yyyy'));
    insert into marketer_account values (1017, 123, 10987, to_date('28.09.2009', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    insert into marketer_account values (1023, 124, 10541, to_date('03.12.2010', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    create table marketer_account_archive (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account_archive values (1015, 124, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('02.12.2010', 'dd.mm.yyyy'));
    insert into marketer_account_archive values (1033, 125, 10987, to_date('01.01.2011', 'dd.mm.yyyy'), to_date('31.01.2012', 'dd.mm.yyyy'));
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account_archive;
    with
    basedata as (
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account_archive
    ),
    basedata_with_max_intervals as (
    select key, AccountId, MktKey, FromDt, ToDate
         , row_number() over(partition by AccountId order by FromDt desc) FromDt_Rank
      from basedata
    ),
    filtered_basedata as (
    select key, AccountId, MktKey, FromDt, ToDate from basedata_with_max_intervals where FromDt_Rank = 1
    )
    select a.id
         , a.name
         , b.MktKey
         , b.FromDt
         , b.ToDate
      from account a
      join filtered_basedata b
        on (a.id = b.AccountId);
    ID NAME                     MKTKEY FROMDT     TODATE
    123 John Doe                  10987 28.09.2009 31.12.4712
    124 Jane Donaldson            10541 03.12.2010 31.12.4712
    125 Harold Douglas            10987 01.01.2011 31.01.2012
    If your tables are big it could be necessary to do the filtering (according to your condition) in an early step (the first CTE).
    Regards
    Martin

  • SharePoint list datasheet view error "Cannot connect to the server at this time. You can continue working with this list, but some data may not be available"

    I have a list which has around 14000 items in it. While opening that list in datasheet view it gives an error.
    Below is a summary of the issue:
    After selecting datasheet view the error below occurs:
        "Cannot connect to the server at this time.  You can continue working with this list, but some data may not be available."
        "Unable to retrieve all data."
        The item count displays only around 100 out of 14000 items.
    Exporting the list to Excel gives only 2000 records out of 14000 records.
    Other observations -
    This is happening to only one list on the site. There are other lists on the site with around 8000 to 9000 records, and they are working absolutely fine without any error.
    Also, if I save this list as a template and create another list from it, the new list works absolutely fine with 14000 records, so the issue does not seem to be related to the number of records.
    I have checked the Alternate Access Mapping settings; they are fine.
    It should not be related to a lookup, date field or any other column, as the list created from its template is working fine with all these columns.
    I checked the links below also, but they don't seem to work in my case.
    http://social.technet.microsoft.com/forums/en-US/sharepointadminprevious/thread/974b9168-f548-409b-a7f9-a79b9fdd4c50/
    http://social.technet.microsoft.com/Forums/en-US/smallbusinessserver/thread/87077dd8-a329-48e8-b42d-d0a8bf87b082
    http://social.msdn.microsoft.com/Forums/en-US/sharepointgeneral/thread/dc757598-f670-4229-9f8a-07656346b9b0

    I have spent two days resolving this issue. Microsoft has released two KBs with reference to this issue... but they do not appear at the top of search results.
    I am sharing my findings.
    1. First install
    KB2552989 (Hopefully you might have already installed it. The KB detects it and informs the user.)
    2. Then update the registry by adding a new key for the data fetch timeout, as mentioned in KB2553007.
    These two steps resolved the issue in our environment. Hope it might help others as well.
    Pradip T. ------------- MCTS(SharePoint 2010/Web)|MCPD(Web Development) https://www.mcpvirtualbusinesscard.com/VBCServer/paddytakate/profile
