Query Issue with adding a static value
I have written a query in SAP that works the way I need it to. I am going to schedule it so that SAP does a file transfer of the output to another area where the data will be accumulated.
My problem is that I want the name of the system to appear at the end of each line but it is not in any table I can find to match up with the userid.
EX:
UserID Group System
JDOE HR Query Group P201
The System field is not in any table, but I want to add it to the output with a constant/static value.
Any ideas?
Thanks!
Justin Dauby -- SAP Security Analyst
Kimball International Inc.
The name of the system is not in a table, but rather a structure called SYST. You can evaluate it using the system field sy-sysid.
Another option is to call function module RFC_SYSTEM_INFO. The export parameter RFCSI_EXPORT will deliver the system id from component RFCSYSID of the structure RFCSI.
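A minimal ABAP sketch of both options - the report name and the variable name are illustrative; the function module and fields are as named above:

```abap
REPORT z_show_sysid.

DATA lv_rfcsi TYPE rfcsi.

* Option 1: read the system field directly (structure SYST, field SYSID)
WRITE: / 'System ID via sy-sysid:', sy-sysid.

* Option 2: call RFC_SYSTEM_INFO and take RFCSYSID from RFCSI_EXPORT
CALL FUNCTION 'RFC_SYSTEM_INFO'
  IMPORTING
    rfcsi_export = lv_rfcsi.

WRITE: / 'System ID via RFC_SYSTEM_INFO:', lv_rfcsi-rfcsysid.
```

Either value can then be appended as a constant column to each output line before the query result is written to the transfer file.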
Cheers,
Julius
Similar Messages
-
Issue with adding a new value to the Default Login Page
Hopefully this is an easy one guys, I've added a select list item to the standard login page (P101) to allow the user to select a DOMAIN as well as entering their username and password.
When the next page is displayed and I try to retrieve the value held in the new field it has been cleared, but the values entered into items
P101_USERNAME and P101_PASSWORD are still present.
I can't figure out why the values held in items P101_USERNAME and P101_PASSWORD are still present but P101_DOMAIN has been cleared.
I've tried creating a value on page zero and writing the domain value to it using pl/sql and javascript but as soon as the next page is requested the value is cleared.
Any ideas how to get around this?
Thanks in advance
I suppose your login page contains a process like this:
BEGIN
wwv_flow_custom_auth_std.login (p_uname => v ('P101_USERNAME'),
p_password => :p101_password,
p_session_id => v ('APP_SESSION'),
p_flow_page => :app_id || ':1');
:t_random_number := :p101_random_number; /* this is something I added to my code */
END;
Now, if you go to this page:
http://apex.oracle.com/pls/otn/f?p=31517:1
please note the random number assigned to you. Once you log in, the same number will appear on page 1 (near the top).
So, that is how you could solve your problem.
:t_random_number is an application item.
Denes Kubicek
http://deneskubicek.blogspot.com/
http://www.opal-consulting.de/training
http://apex.oracle.com/pls/otn/f?p=31517:1
-
Issue with printing of Currency Value.
Hi,
I have an issue with printing of Currency Value.
(Please let me know if a thread for this already exists; I searched before posting but did not find one.)
The Problem is as follows:
I have a field s_value TYPE MARM-SALK3. (which is of currency type).
The value of field is actually 1664.1450. Currency USD
But when I print this variable in an ALV report it prints as 166414.50, even though I used the same data type (SALK3).
Please help me fix this.
Regards,
Venkat
Hi,
If you use ALV to display a currency value, you must make sure the currency amount is linked with its currency unit. You can achieve this in the ALV field catalog:
wa_fieldcat-fieldname = 'SALK3'.   " amount field
wa_fieldcat-tabname = 'ITAB'.      " internal table holding the amount
wa_fieldcat-cfieldname = 'WAERS'.  " currency field that controls the decimals
wa_fieldcat-ctabname = 'ITAB'.     " internal table holding the currency key
append wa_fieldcat to gt_fieldcat.
Please check,
Regards, -
Facing issue with adding systems in SMSY - solution manager
Hi all,
We are facing an issue with Solution Manager, in transaction SMSY. When I create a system in the corresponding component, say ECC with product version ECC 6.0, the next screen shows that the <SID> has been added successfully. But when I navigate to the path, I am unable to see the system SID which I created.
When I try to create the same system with the same SID again, it shows "SID already exists".
I also need another suggestion: we have planned to add 60 systems to the same Solution Manager. Kindly suggest how many systems can be added to a single Solution Manager at maximum, and for configuring EWA.
Thanks
Raj
Hi Raj,
You can delete the old system entry and then try creating the same again.
Also, while creating it, mention the correct product, version, and installation number.
You can also use the assistant for this.
Have you added all your servers and DB?
regards,
Raja. G
Edited by: Raja Gunasekaran on Aug 28, 2009 1:26 PM -
Query issue with exception aggregation
Dear all,
I have to solve the following reporting issue with BEx:
Cube Structure:
Cube A:
Characteristics: Company Code, Article, Distribution Channel, Customer, FiscalYear/Period
Key-Figures: Sales Val., Sales Qty.
Cube B:
Characteristics: Company Code, Article, FiscalYear/Period
Key-Figures: COGS
I simply want to multiply: Sales Qty@COGS = NODIM(Sales Qty) * COGS,
but this calculation should be valid for all characteristics of Cube A, even for those not available in Cube B (like Customer and Distribution Channel). Additionally, the calculated totals of my characteristics must be correct so that I can apply, in a second step, a margin calculation (Sales Val. - Sales Qty@COGS) that is valid on the single level as well as on the total level.
I started to setup calculated key-figures
COGS1 = NODIM(Sales Qty) * COGS with Exception aggregation TOTAL by Company Code
COGS2 = COGS1 with Exception Aggregation TOTAL by Article
and this worked fine for both characteristics. If I use COGS2 in my report I get correct figures in a drilldown by Company Code and Article.
I enhanced the calculation by
COGS3 = COGS2 with Exception Aggregation TOTAL by Distribution Channel, but the result in this case is 0. I guess the result is 0 because the characteristic Distribution Channel is not available in Cube B.
Any ideas how to solve this? Is there perhaps a different (more elegant) approach? I fear that with all the exception aggregations my query runtime/resource consumption will be awful.
Thanks for any idea,
Andreas
Hi,
You should define a new selection for COGS with Constant Selection on Distribution Channel, as described for PRICE with CUSTOMER in the following link:
[http://help.sap.com/saphelp_nw70/helpdata/en/46/91f0f090ea13e8e10000000a155369/content.htm]
and then apply your formulas. I hope it solves the problem.
Regards. -
Any issues with adding this card to my mac?
I would like to add a card to my mac pro that will enable me to capture video from certain types of HD video cameras.
1. Do I have a slot available to put in one of these cards?
2. Are there any potential issues with doing this?
3. I am guessing that by adding the card, it will create an open port on the back of the mac that will allow me to connect the camera.
http://www.blackmagic-design.com/products/intensity/
I have the Dual-Core Intel Xeon, processor speed 3 GHz.
Thanks!
The Intensity HDMI card is normally used to capture UNCOMPRESSED video from an HD vidcam.
Let's say, for example, you get a Canon HV20 camera - it has both FireWire and HDMI outs. The camera, being HDV, uses HDV compression to write to tape. So once video is on tape, you will be capturing HDV-encoded video (just fine with FCP). The HDV standard dictates 1440x1080 even though the sensor of the HV20 is a native 1920x1080. But, since non-square pixels are used, consider it 1920x1080 aspect.
If you hook the HV20 to your Mac Pro using HDMI and the Intensity card, you can capture uncompressed, native 1920x1080 video. However, you better have enormous and extremely fast Raid storage, as it requires sustained data rates roughly, what, 100 times greater than HDV? And you better have a really long extension cord if you plan to use it in the field, as, you must remember, once written to tape it will be compressed HDV video, so the small gain you'll get from native uncompressed 1920x1080 video is only available at the time you shoot, and only out of the HDMI port.
HDV looks stunning as is, so unless you are a professional with special requirements (in which case I've not told you anything you don't already know), you may want to rethink this.
If you shoot well exposed HDV video, and edit with FCP or iMovie HD for that matter, and play back from tape to a 1920x1080i HDTV, it is as good as watching the Discovery Channel HD.
Good luck,
John F -
Issue with function module updated values
Hi Experts,
We have an issue with a customized program. In this program, function module 'BBP_PD_SC_GETDETAIL' does not pick up the updated values while the UI is open. But when we execute the same function module directly, providing the values manually, it picks up the correct data.
We think it may be a problem with session handling or buffer refresh, but we don't know how to handle this issue.
If you have any suggestions, that would be very helpful. Please reply as soon as possible.
Regards,
Chandu
Hi,
Thanks for your quick reply.
I passed the values to the function module in the program and also used the same values when testing the function module manually. I get the correct values when testing manually.
We think it is a session problem, since we developed the customized code. Any idea how to refresh the session while the UI (user interface) is still open?
The issue was solved when the UI was closed and opened again.
Any other ideas on this issue ?
Regards,
Chandu -
Query: Issue with Output
Does anyone by anychance have any suggestions or thoughts?
Oracle: 10.2g
Data:
with t as (
select '39' time, to_date('11/14/2003 7:39:00 PM','MM/DD/RRRR hh:mi:ss pm') game_date,11122 team_id, 11122 home_team_id,null away_team_id,'parks' home_team_name, null away_team_name,'Steve' offensive_player_moniker, 'Smith' offensive_player_last_name, '7' play_result_id, '11' event_code_num, '11111' game_code from dual union all
select '32', to_date('11/14/2003 7:32:00 PM','MM/DD/RRRR hh:mi:ss pm'),24555, null, 24555, null, 'bombers', 'Chris', 'Carter', '7','11','11111' from dual union all
select '35',to_date('12/14/2008 1:35:00 PM','MM/DD/RRRR hh:mi:ss pm'),33211, null, 33211, null, 'carts', 'Marty', 'Booker','7','17','22222' from dual union all
select '30',to_date('12/14/2008 1:30:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555, 15555, null, 'sharks', null, 'Bob', 'Sanders','7','17','22222' from dual union all
select '38',to_date('12/14/2008 1:38:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','19','22222' from dual union all
select '44',to_date('12/14/2008 1:44:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','17','22222' from dual union all
select '26',to_date('12/14/2008 1:26:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','19','22222' from dual union all
select '38',to_date('12/16/2008 1:38:00 PM','MM/DD/RRRR hh:mi:ss pm'),32111, 32111, null, 'carts', null, 'Marty', 'Booker','7','17','33333' from dual union all
select '39',to_date('12/16/2008 1:39:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555, null, 15555, null, 'sharks', 'Marvin','Harrison','7','11','33333' from dual
)
Query:
select substr(x, 1, instr(x, ',', -1) - 1) as game_stats,
play_result_id,
game_code
--Bulk Collect into player_results
from (
select
(case when event_code_num = 17 then
team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| ' pen), ')).extract('//text()'))
else team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| '), ')).extract('//text()'))
end) x,
play_result_id,
game_code,
event_code_num
from (
select
offensive_player_moniker,
offensive_player_last_name,
(case
when home_team_id = team_id then home_team_name
when away_team_id = team_id then away_team_name
end) as team_name,
time minute,
team_id team_id,
play_result_id,
game_code as game_code,
event_code_num as event_code_num,
time,
game_date
--Bulk Collect into player_results
from
t
where event_code_num in (11,17,19,28)
and play_result_id = 7
group by
offensive_player_moniker,
offensive_player_last_name,
home_team_id,
away_team_id,
team_id,
home_team_name,
away_team_name,
play_result_id,
game_code,
event_code_num,
time,
game_date
order by time asc
)
group by team_name, play_result_id, game_code, event_code_num
);
Current output:
GAME_STATS PLAY_RESULT_ID GAME_CODE
1 carts: Marty Booker (35 pen) 7 22222
2 carts: Marty Booker (38 pen) 7 33333
3 parks: Steve Smith (39) 7 11111
4 sharks: Bob Sanders (30 pen), Marvin Harrison (44 pen) 7 22222
5 sharks: Marvin Harrison (26), Marvin Harrison (38) 7 22222
6 sharks: Marvin Harrison (39) 7 33333
7 bombers: Chris Carter (32) 7 11111
DESIRED OUTPUT
GAME_STATS PLAY_RESULT_ID GAME_CODE
1 carts: Marty Booker (38 pen) 7 33333
2 parks: Steve Smith (39) 7 11111
3 bombers: Chris Carter (32) 7 11111
4 sharks: Bob Sanders (30 pen), Marvin Harrison (26, 30 pen, 38,44 pen) 7 22222
5 carts: Marty Booker (35 pen) 7 22222
6 carts: Marty Booker (38 pen) 7 33333
7 sharks: Marvin Harrison (39) 7 33333-One of the issues is that im forced to do a group by 'event_code_num' whis is in turn causing the issue, but i needed in the output.
-Second issue is concatinating the results of players that scored more than once in the game like Marvin Harrison, above
-Third issue is that i want to sort the time in asscending order in the output of game_stats, like so Marvin Harrison +(26, 30 pen, 38,44 pen)+ and i want the result set to be sorted by home_team first followed by away team for each game.
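For the second and third issues - concatenating each player's times in ascending order - one hedged approach is a two-level aggregation with an ordered XMLAGG. This is a sketch against the sample table t above, not a full solution: team-name resolution via the home/away ids is omitted for brevity, and on 11gR2 or later LISTAGG would be simpler.

```sql
-- Level 1: aggregate each player's minutes ascending, tagging penalties;
-- Level 2: aggregate the players per team and game.
SELECT team_id, play_result_id, game_code,
       rtrim(xmlagg(xmlelement(e,
               offensive_player_moniker || ' ' || offensive_player_last_name
               || ' (' || times || '), ')
             ).extract('//text()').getStringVal(), ', ') AS game_stats
FROM (
  SELECT team_id, play_result_id, game_code,
         offensive_player_moniker, offensive_player_last_name,
         rtrim(xmlagg(xmlelement(e,
                 time || CASE WHEN event_code_num = 17 THEN ' pen' END || ', ')
                 ORDER BY to_number(time)               -- minutes in ascending order
               ).extract('//text()').getStringVal(), ', ') AS times
  FROM t
  WHERE play_result_id = 7 AND event_code_num IN (11, 17, 19, 28)
  GROUP BY team_id, play_result_id, game_code,
           offensive_player_moniker, offensive_player_last_name
)
GROUP BY team_id, play_result_id, game_code;
```

Because event_code_num is folded into the aggregated string (the 'pen' tag) instead of the GROUP BY, each player appears once per game, which also addresses the first issue.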
Edited by: user652714 on Mar 17, 2009 9:51 AM
Edited by: user652714 on Mar 17, 2009 10:20 AM
I've gone ahead and added notes as you had suggested, Frank, and thanks for the suggestions regarding general coding!
Table Columns/Definitions:
time : the minute during which a player scored
game_date : the date of the game and time at which a player scored
team_id : the team's id
home_team_id : the home team's id
away_team_id : the away team's id
home_team_name : the home team's name
away_team_name : the away team's name
offensive_player_moniker : offensive player's first name
offensive_player_last_name : offensive player's last name
play_result_id : the number of play that resulted
7 = Scored
event_code_num : the code number for a type of an event
11 = Scored / 17 = Penalty / 19 = Scored in extra time
game_code : the code of the game
DESIRED OUTPUT
You were right, my desired output was incorrect; here it is again:
bombers: Chris Carter (32) 7 11111
parks: Steve Smith (39) 7 11111
sharks: Marvin Harrison (26, 30 pen, 38, 44 pen), Bob Sanders (30 pen) 7 22222
carts: Marty Booker (35 pen) 7 22222
carts: Marty Booker (38 pen) 7 33333
sharks: Marvin Harrison (39) 7 33333
Breakdown of output:
sharks: Marvin Harrison (26, 30 pen, 38, 44 pen), Bob Sanders (30 pen) 7 22222
sharks = team name
Marvin Harrison = player name
(26, 30 pen, 38, 44 pen) = ascending order of minute at which player scored/commited penalty
***Note*** the 'pen' after the minute value signifies that the player committed a penalty at that minute (event_code_num = 17)
'7' = play_result_id
'22222' = game_code
Explanation of code
--Outputs data and gets rid of extra comma in the output
select substr(x, 1, instr(x, ',', -1) - 1) as game_stats,
play_result_id,
game_code
--Bulk Collect into player_results
from (
--subquery that formats player name and time scored/penalty commited into one string
select
(case when event_code_num = 17 then
team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| ' pen), ')).extract('//text()'))
else team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| '), ')).extract('//text()'))
end) x,
play_result_id,
game_code,
event_code_num
from (
--Driving query, populates data
select
offensive_player_moniker,
offensive_player_last_name,
(case
when home_team_id = team_id then home_team_name
when away_team_id = team_id then away_team_name
end) as team_name,
time minute,
team_id team_id,
play_result_id,
game_code as game_code,
event_code_num as event_code_num,
time,
game_date
--Bulk Collect into player_results
from
t
where event_code_num in (11,17,19,28)
and play_result_id = 7
group by
offensive_player_moniker,
offensive_player_last_name,
home_team_id,
away_team_id,
team_id,
home_team_name,
away_team_name,
play_result_id,
game_code,
event_code_num,
time,
game_date
order by time asc
)
group by team_name, play_result_id, game_code, event_code_num
); -
Issue with Run time parameter values
Hi All,
I am facing an error while loading EBS 11.5.10 Financials data. The problem is that the parameters in the SQL Qualifier query (run-time parameters in the SQL query) are not being replaced with the appropriate values at run time. This is happening for the task "SDE_ORA_Reverse_GLJournals_ImportReference_Extract", which has a SQL query in the 'mplt_BC_ORA_XactsimpGlrfExtract' mapplet.
I have checked the parameter file and it has values for those run-time parameters. For some reason, these values are not being picked up by the SQL query.
Can somebody help me with the possible reason(s) behind this?
There are 5 tasks - SDE_PSFT_PayrollFact_Total, SDE_PSFT_PayrollFact_Earnings, SDE_PSFT_PayrollFact_EarningsOther, SDE_PSFT_PayrollFact_Deductions, SDE_PSFT_PayrollFact_Taxes - that are failing, all with the same error: READER_1_1_1> PR_18012 SQL Error [
ORA-00936: missing expression Database driver error...
I verified the mapplet; the missing expression is the $$LAST_EXTRACT_DATE value.
If you know any workaround for this issue, please suggest it.
I am attaching one of the error logs.
DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
DIRECTOR> VAR_27028 Use override value [PSFT_9_0_HCM] for session parameter:[$DBConnection_OLTP].
DIRECTOR> VAR_27028 Use override value [12] for mapping parameter:[$$DATASOURCE_NUM_ID].
DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_PSFT_PayrollFact_Total.$$INITIAL_EXTRACT_DATE].
DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_PSFT_PayrollFact_Total.$$LAST_EXTRACT_DATE].
DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[$$TENANT_ID].
DIRECTOR> TM_6014 Initializing session [SDE_PSFT_PayrollFact_Total_Full] at [Wed Dec 10 13:41:44 2008]
DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Base_Integration_Service]
DIRECTOR> TM_6686 Folder: [SDE_PSFT_90_Adaptor_Test]
DIRECTOR> TM_6685 Workflow: [SDE_PSFT_PayrollFact_Total_Full]
DIRECTOR> TM_6101 Mapping name: SDE_PSFT_PayrollFact_Total [version 1]
DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.1.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_PSFT_PayrollFact_Total_Full].
DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
DIRECTOR> TM_6703 Session [SDE_PSFT_PayrollFact_Total_Full] is run by 32-bit Integration Service [node01_HSCHBSCGN20031], version [8.1.1], build [0831].
MANAGER> PETL_24058 Running Partition Group [1].
MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
MANAGER> PETL_24001 Parallel Pipeline Engine running.
MANAGER> PETL_24003 Initializing session run.
MAPPING> CMN_1569 Server Mode: [ASCII]
MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> TM_6151 Session Sort Order: [Binary]
MAPPING> TM_6156 Using LOW precision decimal arithmetic
MAPPING> TM_6180 Deadlock retry logic will not be implemented.
MAPPING> TM_6307 DTM Error Log Disabled.
MAPPING> TE_7022 TShmWriter: Initialized
MAPPING> DBG_21075 Connecting to database [orcl], user [obaw]
MAPPING> TM_6007 DTM initialized successfully for session [SDE_PSFT_PayrollFact_Total_Full]
DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Dec 10 13:41:45 2008)
MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Dec 10 13:41:46 2008)
DIRECTOR> PETL_24006 Starting data movement.
MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 1280000 bytes.
READER_1_1_1> DBG_21438 Reader: Source is [HR90DEV], user [sysadm]
READER_1_1_1> DBG_21438 Reader: Source is [HR90DEV], user [sysadm]
READER_1_1_1> BLKR_16003 Initialization completed successfully.
WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [obaw], bulk mode [OFF]
WRITER_1_*_1> WRT_8124 Target Table W_PSFT_PAYROLL_F_TMP :SQL INSERT statement:
INSERT INTO W_PSFT_PAYROLL_F_TMP(EMPLID,EMPL_RCD,PAYGROUP,PAY_PERIOD_START_DT,PAY_PERIOD_END_DT,CHECK_DT,OFF_CYCLE,PAGE_NUM,LINE_NUM,SEPCHK,COMPANY,DEPTID,LOCATION_CD,PLAN_TYPE,BENEFIT_PLAN,BENEFIT_RCD_NBR,DEDCD,DED_CLASS,DED_SLSTX_CLASS,ADDL_NBR,PAY_EARN_START_DT,PAY_EARN_END_DT,ERNCD,COUNTRY,STATE,LOCALITY,TAX_CLASS,PAYROLL_TYPE,PAY_DETAIL_FLG,PAYROLL_AMOUNT,CHANGED_ON_DT,DATASOURCE_NUM_ID,INTEGRATION_ID,TENANT_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_PSFT_PAYROLL_F_TMP]
WRITER_1_*_1> WRT_8003 Writer initialization complete.
READER_1_1_1> BLKR_16007 Reader run started.
READER_1_1_1> PR_18039 Application Source Qualifier [mplt_BC_PSFT_PayrollFact_Total.Sq_Payroll_Total] is reading from PeopleSoft.
WRITER_1_*_1> WRT_8005 Writer run started.
WRITER_1_*_1> WRT_8158
*****START LOAD SESSION*****
Load Start Time: Wed Dec 10 13:41:46 2008
Target tables:
W_PSFT_PAYROLL_F_TMP
READER_1_1_1> PR_18008 Application SQ instance [mplt_BC_PSFT_PayrollFact_Total.Sq_Payroll_Total] User SQL Query [SELECT
PS_PAY_CHECK.COMPANY ,
PS_PAY_CHECK.PAYGROUP ,
PS_PAY_CHECK.PAY_END_DT ,
PS_PAY_CHECK.OFF_CYCLE ,
PS_PAY_CHECK.PAGE_NUM ,
PS_PAY_CHECK.LINE_NUM ,
PS_PAY_CHECK.SEPCHK ,
PS_PAY_CHECK.EMPLID ,
PS_PAY_CHECK.EMPL_RCD ,
PS_PAY_CHECK.DEPTID ,
'TOTAL_GROSS' SUMMARY_TYPE,
PS_PAY_CHECK.TOTAL_GROSS SUMMARY_AMOUNT,
PS_PAY_CHECK.CHECK_DT ,
PS_PAY_CHECK.LOCATION ,
PS_PAY_CHECK.UPDATE_DT
FROM
PS_PAY_CHECK
WHERE
PS_PAY_CHECK.PAYCHECK_STATUS IN ('A','F','R') AND
PS_PAY_CHECK.PAY_END_DT >=
UNION ALL
SELECT
PS_PAY_CHECK.COMPANY ,
PS_PAY_CHECK.PAYGROUP ,
PS_PAY_CHECK.PAY_END_DT ,
PS_PAY_CHECK.OFF_CYCLE ,
PS_PAY_CHECK.PAGE_NUM ,
PS_PAY_CHECK.LINE_NUM ,
PS_PAY_CHECK.SEPCHK ,
PS_PAY_CHECK.EMPLID ,
PS_PAY_CHECK.EMPL_RCD ,
PS_PAY_CHECK.DEPTID ,
'TOTAL_TAXES' SUMMARY_TYPE,
PS_PAY_CHECK.TOTAL_TAXES SUMMARY_AMOUNT,
PS_PAY_CHECK.CHECK_DT ,
PS_PAY_CHECK.LOCATION ,
PS_PAY_CHECK.UPDATE_DT
FROM
PS_PAY_CHECK
WHERE
PS_PAY_CHECK.PAYCHECK_STATUS IN ('A','F','R') AND
PS_PAY_CHECK.PAY_END_DT >=
UNION ALL
SELECT
PS_PAY_CHECK.COMPANY ,
PS_PAY_CHECK.PAYGROUP ,
PS_PAY_CHECK.PAY_END_DT ,
PS_PAY_CHECK.OFF_CYCLE ,
PS_PAY_CHECK.PAGE_NUM ,
PS_PAY_CHECK.LINE_NUM ,
PS_PAY_CHECK.SEPCHK ,
PS_PAY_CHECK.EMPLID ,
PS_PAY_CHECK.EMPL_RCD ,
PS_PAY_CHECK.DEPTID ,
'TOTAL_DEDUCTIONS' SUMMARY_TYPE,
PS_PAY_CHECK.TOTAL_DEDUCTIONS SUMMARY_AMOUNT,
PS_PAY_CHECK.CHECK_DT ,
PS_PAY_CHECK.LOCATION ,
PS_PAY_CHECK.UPDATE_DT
FROM
PS_PAY_CHECK
WHERE
PS_PAY_CHECK.PAYCHECK_STATUS IN ('A','F','R') AND
PS_PAY_CHECK.PAY_END_DT >=
UNION ALL
SELECT
PS_PAY_CHECK.COMPANY ,
PS_PAY_CHECK.PAYGROUP ,
PS_PAY_CHECK.PAY_END_DT ,
PS_PAY_CHECK.OFF_CYCLE ,
PS_PAY_CHECK.PAGE_NUM ,
PS_PAY_CHECK.LINE_NUM ,
PS_PAY_CHECK.SEPCHK ,
PS_PAY_CHECK.EMPLID ,
PS_PAY_CHECK.EMPL_RCD ,
PS_PAY_CHECK.DEPTID ,
'NET_PAY' SUMMARY_TYPE,
PS_PAY_CHECK.NET_PAY SUMMARY_AMOUNT,
PS_PAY_CHECK.CHECK_DT ,
PS_PAY_CHECK.LOCATION ,
PS_PAY_CHECK.UPDATE_DT
FROM
PS_PAY_CHECK
WHERE
PS_PAY_CHECK.PAYCHECK_STATUS IN ('A','F','R') AND
PS_PAY_CHECK.PAY_END_DT >= ]
READER_1_1_1> CMN_1761 Timestamp Event: [Wed Dec 10 13:41:46 2008]
READER_1_1_1> PR_18012 SQL Error [
ORA-00936: missing expression
Database driver error...
Function Name : Execute
SQL Stmt : (the same four-branch UNION ALL query as above; each branch ends with "PS_PAY_CHECK.PAY_END_DT >=" and no value follows)
Oracle Fatal Error
Database driver error...
Function Name : Execute
SQL Stmt : (same query repeated)
Oracle Fatal Error]
READER_1_1_1> CMN_1761 Timestamp Event: [Wed Dec 10 13:41:46 2008]
READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_PSFT_PAYROLL_F_TMP] at end of load
WRITER_1_*_1> WRT_8035 Load complete time: Wed Dec 10 13:41:46 2008
LOAD SUMMARY
============
WRT_8036 Target: W_PSFT_PAYROLL_F_TMP (Instance Name: [W_PSFT_PAYROLL_F_TMP])
WRT_8044 No data loaded for this target
WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
MANAGER> PETL_24031
***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_PSFT_PayrollFact_Total.Sq_Payroll_Total] has completed. The total run time was insufficient for any meaningful statistics.
Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_PSFT_PayrollFact_Total.Sq_Payroll_Total] has completed. The total run time was insufficient for any meaningful statistics.
Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_PSFT_PAYROLL_F_TMP] has completed. The total run time was insufficient for any meaningful statistics.
MANAGER> PETL_24005 Starting post-session tasks. : (Wed Dec 10 13:41:47 2008)
MANAGER> PETL_24029 Post-session task completed successfully. : (Wed Dec 10 13:41:47 2008)
MAPPING> TM_6018 Session [SDE_PSFT_PayrollFact_Total_Full] run completed with [0] row transformation errors.
MANAGER> PETL_24002 Parallel Pipeline Engine finished.
DIRECTOR> PETL_24013 Session run completed with failure.
DIRECTOR> TM_6022
SESSION LOAD SUMMARY
================================================
DIRECTOR> TM_6252 Source Load Summary.
DIRECTOR> CMN_1740 Table: [Sq_Payroll_Total] (Instance Name: [mplt_BC_PSFT_PayrollFact_Total.Sq_Payroll_Total])
Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
DIRECTOR> TM_6253 Target Load Summary.
DIRECTOR> CMN_1740 Table: [W_PSFT_PAYROLL_F_TMP] (Instance Name: [W_PSFT_PAYROLL_F_TMP])
Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
DIRECTOR> TM_6023
===================================================
DIRECTOR> TM_6020 Session [SDE_PSFT_PayrollFact_Total_Full] completed at [Wed Dec 10 13:41:48 2008] -
Advice Please: Adding Few Static Values to Stored Proc Returned Result Set
Hello -
The stored proc below returns about 1000 rows of result set when a date is passed to the query. In addition, there are about 5 rows of data (which are in my Excel file) that need to be added to the result set. What is the best possible way?
What I have done is create a new table which has those 5 static rows. It has just the addresses, amount, and customer number. There is no invoice date in that table. I tried UNION, and the query did not like the fact that I had something like that in
the first table:
SELECT
'0' as 'TYPE'
,'TBCCA' as 'Acc ID'
,'12882' as 'BEID'
, '' as ' # OF UNITS'
, Circuit_Total as 'AMT'
,'D' as 'D or C'
, '' as ' FDOC yyyymmdd'
, '' as ' LDOC yyyymmdd'
, Address as 'Brief Comment'
, city as 'Tax City'
, State as 'Tax State'
, Zip_Code as 'Tax Zip Code'
, INV_DATE as 'Invoice Date'
FROM [dbo].[TBC_MonthlyCircuitDetail]
WHERE INV_DATE = '2014-07-01'
AND [Circuit_Total] >0
UNION
SELECT 0
'0' as 'TYPE' --DID NOT LIKE THIS <<
,'TBCCA' as 'Acc ID'
,'12882' as 'BEID'
, '' as ' # OF UNITS'
, Circuit_Total as 'AMT'
,'D' as 'D or C'
, '' as ' FDOC yyyymmdd'
, '' as ' LDOC yyyymmdd'
, Address as 'Brief Comment'
, city as 'Tax City'
, State as 'Tax State'
, Zip_Code as 'Tax Zip Code'
, '' INV_DATE as 'Invoice Date'
FROM [dbo].[TBC_MonthlyCircuitDetailStaticValues]
WHERE INV_DATE = '2014-07-01'
AND [Circuit_Total] >0
Stored Proc below:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[SP_TBC_ORBIT Process Form]
-- Add the parameters for the stored procedure here
@INVOICE_DATE varchar(20)
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
SELECT
'0' as 'TYPE'
,'TBCCA' as 'Acc ID'
,'12882' as 'BEID'
, '' as ' # OF UNITS'
, Circuit_Total as 'AMT'
,'D' as 'D or C'
, '' as ' FDOC yyyymmdd'
, '' as ' LDOC yyyymmdd'
, Address as 'Brief Comment'
, city as 'Tax City'
, State as 'Tax State'
, Zip_Code as 'Tax Zip Code'
, INV_DATE as 'Invoice Date'
FROM [dbo].[TBC_MonthlyCircuitDetail]
WHERE INV_DATE IN (@INVOICE_DATE)
AND [Circuit_Total] >0
END
GO
So, what do you suggest? As you can tell, I am quite a novice but I am learning by researching, asking questions and doing it. Thanks in advance!
Sanjeev Jha
Your second SELECT statement is as below:
SELECT 0
'0' as 'TYPE' --DID NOT LIKE THIS <<
,'TBCCA' as 'Acc ID'
,'12882' as 'BEID'
, '' as ' # OF UNITS'
, Circuit_Total as 'AMT'
,'D' as 'D or C'
, '' as ' FDOC yyyymmdd'
, '' as ' LDOC yyyymmdd'
, Address as 'Brief Comment'
, city as 'Tax City'
, State as 'Tax State'
, Zip_Code as 'Tax Zip Code'
, '' INV_DATE as 'Invoice Date' -- CHANGE HERE
FROM [dbo].[TBC_MonthlyCircuitDetailStaticValues]
WHERE INV_DATE = '2014-07-01'
AND [Circuit_Total] >0
Are you receiving the invoice date from the INV_DATE column of the table, or do you want to set it as blank?
Change it to either ('' AS 'Invoice Date') or (INV_DATE AS 'Invoice Date').
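Both branches of a UNION must list the same number of columns: the stray `SELECT 0` line adds an extra expression with no comma after it, and `'' INV_DATE` in the last column is two expressions where one is expected. A runnable miniature of the corrected pattern, sketched with Python's sqlite3 (the table and column names here are shortened stand-ins for the poster's tables, not the real schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Hypothetical miniatures of the poster's two tables: the detail table
# has an invoice date, the static-values table does not.
con.execute("CREATE TABLE detail (address TEXT, circuit_total REAL, inv_date TEXT)")
con.execute("CREATE TABLE static_rows (address TEXT, circuit_total REAL)")
con.execute("INSERT INTO detail VALUES ('1 Main St', 100.0, '2014-07-01')")
con.execute("INSERT INTO static_rows VALUES ('2 Oak Ave', 50.0)")

invoice_date = "2014-07-01"
# Each branch lists exactly the same columns; the static branch fills
# the missing invoice date with a constant (here, the bound parameter).
rows = con.execute(
    """
    SELECT '0' AS type, address, circuit_total AS amt, inv_date AS invoice_date
      FROM detail
     WHERE inv_date = ? AND circuit_total > 0
    UNION ALL
    SELECT '0', address, circuit_total, ?
      FROM static_rows
     WHERE circuit_total > 0
    """,
    (invoice_date, invoice_date),
).fetchall()
print(rows)
```

UNION ALL, rather than UNION, keeps every row and skips the duplicate-elimination step, which matches the goal of simply appending the 5 static rows. Note also that the proc's `IN (@INVOICE_DATE)` with a single varchar parameter behaves like plain equality; it does not expand a comma-separated list of dates.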
-Vaibhav Chaudhari -
Unable to query records with data in exponential value ? ?
Hello,
I have a strange problem here, hope someone can answer my query.
I have 2 tables where the 1st col is a seq with datatype NUMBER(22) and NOT NULL constraint. Now, the value within that col is quite large and hence when I query the table in TOAD/SQL developer it displays output in exponential form. Now the issue is this, In one table I'm able to find/search a particular row by using the exp value within where clause and the same is not possible in other table.
The sample data is here.
I'm able to search records using the values below in one table, but the same is not possible in the other.
1200000211600013
1.20000021160001E15
Can anybody tell me what could be the possible reason for this or where should I look for to find an answer?
Thanks in Advance!!
Toufiq
Seems there's no problem in Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
with
  numbers as
    (select 1200000211600013330 the_number from dual union all
     select 1200000211600013333 from dual)
select n.the_number, s.search_string
  from numbers n,
       (select :your_number search_string
          from dual) s
 where n.the_number(+) = to_number(s.search_string)
THE_NUMBER           SEARCH_STRING
1200000211600013333  1200000211600013333
1200000211600013333  1.200000211600013333E18
1200000211600013333  12000002116.00013333E+8
1200000211600013330  1.20000021160001333E+18
                     1.20000021160001333E17
Even implicit conversion seems to work
select n.the_number,s.search_string,dump(s.search_string,16) search_dump,dump(n.the_number) result_dump
from numbers n,
(select :your_number search_string
from dual
) s
where s.search_string = n.the_number(+)
THE_NUMBER:    1200000211600013330
SEARCH_STRING: 1.20000021160001333E18
SEARCH_DUMP:   Typ=1 Len=22: 31,2e,32,30,30,30,30,30,32,31,31,36,30,30,30,31,33,33,33,45,31,38
RESULT_DUMP:   Typ=2 Len=11: 202,2,21,1,1,22,17,1,2,34,31
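What the thread is circling around is significant digits: the column stores the exact value, but a client that renders (or accepts) only 15 significant digits in scientific notation can no longer round-trip it. A quick illustration in Python (plain floats rather than Oracle NUMBERs, but the same loss of trailing digits):

```python
# The column stores the exact 16-digit integer.
stored = 1200000211600013

# Pasting the full literal back into a WHERE clause matches exactly.
assert int("1200000211600013") == stored

# The 15-significant-digit scientific form shown by the tool parses
# to a *different* number, so an equality search finds no row.
displayed = float("1.20000021160001E15")
print(int(displayed))            # 1200000211600010
print(int(displayed) == stored)  # False
```

One plausible reason one of the two tables happened to search fine: when a stored value's trailing digits survive the rounding (e.g. they are already zero), the displayed string parses back to the same number; when they do not (as with ...013 above), the exponential form no longer matches. Searching on the full literal, or raising the tool's display precision, avoids the mismatch.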
Regards
Etbin -
Issues with adding a line....
Can someone PLEASE give me some course of action to take here. I feel extremely frustrated and upset with Verizon, and none of the reps seem to want to be of any assistance at all……
Recently, my wife and I decided that it was time to consolidate our phones to one carrier. She previously had AT&T, and I have had Verizon for 11+ years. After calling and talking to SEVERAL Verizon reps, and ensuring that my current features and promos would NOT change if I added my wife’s phone to my account, we went ahead and ported her line to Verizon. This is where all the issues begin.
To begin with, I called and talked to two different Verizon reps, who assured me that if I were to change my plan to a family share plan, I would CONTINUE receiving all discounts and promotions that I was already receiving on my primary line. They did note that some of these discounts would not be available to my “new” line, which was understandable, but again I was assured that I wouldn’t lose ANY promotions. The promotions that I am referring to are as follows:
18% Access Discount
Unlimited Email & Web
5 Friends and Family
Talk & Text Plus Data $20 Disc
Within the first week of adding my wife’s phone to my account, I noticed that I had indeed lost the “Talk & Text Plus Data $20 Disc”. I immediately called Verizon and inquired about why this promotion was lost, and I got several different arguments and rebuttals from the Verizon reps. The responses spanned the gamut of “There was never a $20 Disc on your account” to “That discount was only valid for 1 year upon enrolling”. I referred all the reps to the notes on my account that were written by the previous reps who assured me that this specific promo would remain. I also went as far as reading them the information I have on the promo stating that as long as I remain in a qualified plan, the $20 promo would remain for life. I was then told there was nothing they could do, and was transferred to “Loyalty”.
Once transferred to “Loyalty” I talked to a very nice rep, who pulled up the information on the discount and assured me AGAIN that I was eligible for this promotion and she put in what is called an “Inactive Pricing Request” form to have the promotion reinstated on my account. She also assured me that in the event that the promotion was not immediately applied to my account, she would set my account up for review when the new bill came out, and she would personally issue me the $20 credit.
To make a long story short, my new bill came, and there was no new credit. I called Verizon back, and asked to speak to “Loyalty”, and the rep gave me some run around about not being able to send me directly to Loyalty without him attempting to help me first. Finally after a 10 – 15 minute conversation he transferred me to Loyalty. This time, the Loyalty rep told me that I was NOT eligible for the promotion. I had her review all the notes on my account from the previous rep, and she verified that the rep had assured me I was eligible, had put in the inactive pricing request, and had assured me she would give me the $20 credit if the promotion wasn’t added. The rep then told me that the other rep had bold-faced lied to me and she would be filing a complaint with her supervisor. I requested to speak to her supervisor.
Her supervisor got on the line and stated that she was aware of the situation, there was nothing that they could do, and that the other reps that assured me I was still eligible for the promotion would get complaints filed through their supervisors for inaccuracy. I asked this rep to EMAIL the last Loyalty rep that assisted me and have her call me. She obliged, and within 15 minutes I was back on the phone with the ORIGINAL Loyalty rep.
The ORIGINAL Loyalty rep reviewed all the new comments on my account, and AGAIN assured me that I was STILL ELIGIBLE for the promotion, and she was going to “escalate” the inactive pricing request. She also told me that she had discussed my situation with her supervisor, and after reading the promotion details he too agreed that I was eligible for the promo. She assured me that I would hear back from them regarding the status of the promotion being added back to my account before my bill due date.
Upon reviewing my bill I noticed that it was extremely high. Upon calling Verizon AGAIN, it was disclosed to me that since my wife’s number was ported into Verizon from AT&T, and they gave her a temporary number to use until the number was ported over, I was being billed TWO $35 line activation fees. One for her actual ported line, and the other for the “temporary number” that was only in existence for 1 hour, and never even used. Needless to say upon review, that $35 was credited back to me.
My dilemma now is what do I do? Even though several Verizon reps assured me that my discounts would remain, at least one of them didn’t. Since I have spent so much time fighting with Verizon reps on the phone, and awaiting my new bill, we are past the time to be able to cancel my wife’s line and revert everything back to the way it was. The benefit of this promotion was that I received a $20/mo discount through the life of my contract, with no end date. I have yet to hear back from the ORIGINAL Verizon Loyalty rep even though I have had her emailed again to contact me, and all the other reps claim there is nothing that can be done, and I should’ve taken measures prior to upgrading to ensure my promotions would remain.
spottedcatfish wrote:
If you want consistent answers, always ask for official written documentation. For instance, you can read the Terms & Conditions of service on this website, and you should also receive a copy when you sign a contract in store. Verizon has too many employees to be able to train them all perfectly, and then you can add in the inconsistency from all the different distribution channels (premium retailers, third party retailers like Amazon and Best Buy, etc.). Should they strive for it? Absolutely. Will that help you as a customer when you get a dumb or new rep? Nope. So help yourself, get your information in writing or don't believe what you hear.
While in theory this is a great idea, the problem lies with Verizon once again. When talking to the Verizon reps who tell me my account doesn't qualify, and they are able to "quote" to me the reasons from the promotional documentation, when I request they send me written proof of why I don't qualify, I get the reply "We can't release internal Verizon documents to the public." I quote them what the documentation I have states, and they claim their documentation is different, yet they aren't allowed to release any documentation proving my account is in fact ineligible.
Also, when requesting documentation as to why the "inactive pricing request" was denied, the response I get is that my account is not eligible. I ask for documentation and get the reply that the "inactive pricing requests" are done by a third party that is not affiliated with Verizon, and they can't release those documents to me either.
It seems a little odd to me that they have all this documentation that conflicts with all the information I have, yet they can't provide me proof of anything that is being said. The problem is they want you to think you are wrong. They want you to second guess yourself and hope you give up. It is much easier to tell a customer "No", rather than take the time to fix the issues at hand. -
Access 2010, Sharepoint 2010 issue with adding field to access db table
I needed to add an additional field to one of my access tables contained in the sharepoint list.
My application is a front end Access DB (run locally), and the backend is the 'data' database contained in the sharepoint lists.
I added the field to the database (called BOM). When I open the access list (from sharepoint) , and also look at the database in access, the field is there and things are fine.
However, it does not show up in my linked front end. I deleted the linked sharepoint tables from my front end, and then re-imported. Now I see the BOM field I created, but it has replaced the 'ID' field, which was the primary key. I had
added the 'BOM' field as the last field in the table. Now it is the first field, having replaced the 'ID' field.
What's going on here?
Thanks in advance!Hi,
This is the forum to discuss questions and feedback for Microsoft Office; your issue is more related to SharePoint 2010, so I'll move your question to the SharePoint 2010 forum:
http://social.technet.microsoft.com/Forums/sharepoint/en-US/home?forum=sharepointgeneralprevious
The reason why we recommend posting appropriately is you will get the most qualified pool of respondents, and other partners who read the forums regularly can either share their knowledge or learn from your interaction with us. Thank you for your understanding.
George Zhao
TechNet Community Support -
Issue with setting a default value to a Tabular Form field
Hi -
I'm running into an issue setting the default value of a tabular form column. I'm trying to set the default value to the row number (#rownum#). This used to work in previous versions, but now it's returning 0. Is there a way to set this value in the default value attribute of the field with a substitution string?
Thank you in advance for your help!
Share with us what worked in previous versions.
Jeff -
Issue with creating List of Values in Business View Manager
Hi,
Could any one help me to figure out what this issue refers to?
When i created a List of Values in Business View Manager, it is throwing the following error!!!!
The List of Values used by this parameter cannot be based on a Business View derived from the current Data Foundation or Business Element.
Prior to creating LoV, i've done successfully with creating Data connection, Data foundation, Business element and Business View..
Thanks
Viswa
Hi Viswa,
Firstly:
--> Open Repository Explorer, where you have the related DC, DF, BV, BE and LOVs.
--> Right-click on the LOV -- there you will find "Schedule List of Values". Schedule the list of values and check whether it reports any errors.
If you face the same problem, what is the error you are getting? Otherwise, try to use the LOVs and proceed further.
--> Check whether the LOVs point to the same Business View, and that the Business View points to the same Data Foundation.
If you are still facing the same problem, try the steps below:
--> Try to create a report at designer level using the Business View.
--> Create two new parameters and select the existing (LOV) parameters.
--> Check whether it is working.
--> If you face the same problem, update the connection in Business View Manager:
go to the Database tab, select "Set Datasource Location" and update it with the current Business View.
Try this, and let us know.
Regards,
Navin