Pre-test Questions used as Knowledge Builders

I need questions at the beginning of a course that will not be included when the learner clicks on the Retake and Review Quiz buttons. Pre-test questions work but they don't allow the use of text captions for incorrect and correct feedback. They use smart shapes. Are there any other limitations of Pre-test Questions?

There are quite a few limitations. Pretest questions are expected to appear in sequence at the very start of the course, and all navigation is disabled: both the playbar and the TOC.
What you say about the feedback being limited to smart shapes as text containers is not true, though. Pretest questions follow the same styling as normal question slides. If you turned on Use Shapes for SFH under Preferences, Defaults, feedback will use shapes. You can change this even after inserting the quiz/pretest slides by going to the corresponding master slide, right-clicking the shape and choosing 'Revert to caption'. Why you would prefer captions is a mystery to me, though - just my personal opinion, because shapes are so much easier to format.
If you use normal question slides and do not add their score to the Total score, most system quizzing variables will not include those 'Knowledge' slides. But indeed, as you point out, Review and Retake will go to the first question slide, which may be a knowledge slide. With a shared/advanced action it is pretty easy to circumvent this. I almost never use pretest slides because of their limitations. Whenever possible I use custom question slides instead of the default ones for knowledge slides.

Similar Messages

  • Can we send pre-test score (Pre-test question slide) and post test score (Question slide) both in lms from captivate?


    The pretest score is normally not sent to the LMS. However, it is stored in a system variable, so you could use JavaScript to send it.
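A minimal sketch of that JavaScript approach, assuming an HTML5-published module and a SCORM 1.2 LMS. The API lookup and the variable name follow common Captivate/SCORM conventions; "cmi.comments" is shown purely as an example destination - where you actually park a second score depends entirely on your LMS, so treat every name here as an assumption to verify:

```javascript
// Sketch: pushing the Captivate pretest score to a SCORM 1.2 LMS.
// Both objects are passed in as parameters so the logic stays testable.

// Walk up the frame hierarchy looking for the SCORM 1.2 "API" object
// that the LMS exposes on one of the parent windows.
function findSCORMAPI(win) {
  var tries = 0;
  while (win && tries < 10) {
    if (win.API) return win.API;
    if (win.parent === win) break; // reached the top window
    win = win.parent;
    tries++;
  }
  return null;
}

// Read the pretest score from a Captivate interface object and hand it
// to the LMS. "cmi.comments" is only an example field (see note above).
function sendPretestScore(cpAPI, scormAPI) {
  var score = cpAPI.getVariableValue("cpQuizInfoPretestPointsScored");
  if (scormAPI) {
    scormAPI.LMSSetValue("cmi.comments", "pretest=" + score);
    scormAPI.LMSCommit("");
  }
  return score;
}
```

In a real module this would run from an Execute JavaScript action, with `window.cpAPIInterface` as the first argument and `findSCORMAPI(window)` as the second.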

  • Setting a Conditional Action for Pre-Test questions: Skipping Content Slides

    OK boys and girls, let's see if this is even possible within one course SCO, or if this a multi-SCO solution (which means I'm toast, because we don't have the multi-SCO aggregator)?
    I want to set up a prescriptive pre-test. In my ideal world, answers to each individual question would either enable or disable the content slides connected to the question. If the user passes the pre-test, they could opt to skip the prescriptive stuff and complete the course. If the user fails the pretest, they would have to view the enabled content.
    Is this even possible in a single SCO world?
    Justenuf

    Jim belongs in that pantheon of good guys as well, so many here have pointed me in the right direction and helped me move forward with the capabilities of this software. And, in that spirit - here's my work around for the pretest/post test in a SCORM 1.2 environment:
    Created an entrance slide that allows opting to take the pretest or not - but only allows you to take it once.
    Created one test (took off the Question X of X widget), split it, and slid the content in between. In this case, 10 questions with 5 in each set.
    Titled the first set of questions as a PreTest and the second set as a Post Test.
    Set my passing score at 4 correct. (The user only needs to get 4 out of 5 on either set correct to pass, and I'm setting up an action that only allows the user to see the questions they missed if they fail the pretest.)
    Created variables for each "pretest" question and set them to 0. Created On Enter advanced actions on each slide following a question that toggle the variable to 1 if the question was answered correctly: if "cpQuizInfoLastSlidePointScored is equal to 1" then "Assign Question<x> with 1".
    Created an On Enter advanced action for each content area slide "Skip #" that allowed prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the content connected to the missed questions in the pretest is seen.
    Copied the same On Enter advanced action to the Post Test questions and renamed it "SkipPost #", which allowed prescription:
    "If Question # is equal to 1 or cpQuizInfoPassFail is equal to 1, then Go to Next Slide"
    Only the questions related to the reviewed content are seen.
    The sum total of these activities is the following -
    The user can choose to take the pretest and complete the course by getting 4 or 5 questions correct, then jump to the exit page.
         If the user takes the pretest and scores lower than 80%, the content for the missed questions is viewed and the user must answer the questions regarding that content. The user passes the course when he/she answers the fourth question correctly.
    The user can choose to view the content, take the post test and pass the course by answering 4 of 5 correctly.
    It runs fine in preview, in web browser preview and in my LMS previewer, with one navigation error.
          The error: after failing the pretest, taking the prescribed content and answering the questions correctly, the course occasionally jumps to the results page and displays a failure message, then freezes in place. It does NOT happen each time, which is why I am currently scurrying back to my happy place to dwell on this mystery awhile.
    Justenuf
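The per-slide On Enter "Skip" pattern described above can be sketched as a plain function. This is only an illustration: `questionCorrect` and `quizPassed` are hypothetical stand-ins for the `Question<x>` user variable and the `cpQuizInfoPassFail` system variable from the post:

```javascript
// Sketch of the "Skip #" / "SkipPost #" decision from the workaround.
// questionCorrect: the Question<x> user variable (1 = answered correctly).
// quizPassed: the cpQuizInfoPassFail system variable (1 = quiz passed).
// Returns "next" when the slide should be skipped, "stay" otherwise.
function skipDecision(questionCorrect, quizPassed) {
  if (questionCorrect === 1 || quizPassed === 1) {
    return "next"; // Go to Next Slide: learner already knows this content
  }
  return "stay"; // show the remedial content / related question
}
```

In Captivate itself this stays an Advanced Action; the function only makes the branching condition explicit.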

  • Pre and Post test Questions

    I have a project that requires a pre-test, post-test and
    knowledge reviews throughout the course. I have created 10 question
    slides (I did not use question pools). I am having two issues
    with the course. First, I do not want to track or score the
    pre-test questions or the questions embedded within the
    course, which I will refer to as Knowledge Reviews.
    I do want the user to go through and receive feedback on each
    question answered incorrectly but I do not want a score to be
    collected/tracked or tallied as part of the sum total for the
    course. The post-test questions need to be the only tracked/scored
    questions with that value being submitted to the LMS.
    Currently all the questions are being tallied in the course.
    (i.e test results show 30 questions with a score when there should
    only be 10). What is the best way to include multiple
    questions within my course (as pre-tests and Knowledge
    Reviews) but not have a score tracked until the user takes
    the post-test?
    To correct this, should I change the pre-test and knowledge
    reviews to a survey-type question? Or should I leave the
    questions as graded but set the point value to 0? I initially
    left the pre-test and knowledge reviews as graded questions
    but deselected Report Answers under the Reporting tab,
    thinking this would prevent the passing of a score, which is
    not the case.
    If anyone can offer some advice or point me to some
    documentation that discusses this, I would greatly appreciate
    any help.
    The second issue I have is with the progress indicator within
    the course. Is there a way to remove this from the question
    pages? My client does not want to see this indicator. The
    current workaround I am using is to reduce the font size of
    the indicator to 5pt, set the font color to match my
    background, and then position the indicator in a set location
    on the page. While this workaround works, I would like to
    remove it entirely if possible.
    Thanks in advance,

    The first part sounds very weird. I just made a project which
    included knowledge checks in the middle of the course and they are
    not tracked in the final test. I used Graded questions in my course
    and I initially set the score per question to 0 points. After I
    found out that they were counted in the question total I removed
    the reporting for each of the questions. Now the points are not
    being counted, and the questions are not being counted in the
    final assessment - I just have the 6 questions out of 6 in
    the post-test even though I have 4 knowledge checks inside
    the course itself.
    Try pressing F9 to open the Advanced Interaction panel and
    make sure that the answers are not being reported. I know
    it's obvious, but sometimes Captivate does weird things.
    As for your second question regarding the progress indicator
    you can remove that by going to Quiz - Quiz Preferences - Settings
    and then unchecking the box "Show Progress".
    Best regards,
    Michael

  • Can I have multiple pre tests and a final test?

    Hi,
    I want to create a unit whereby I have small pre-tests (5) to check knowledge; the learner must get the answers correct to continue. Then I want to create a final test at the end of the project.
    Is this possible or do I need to have different projects?
    Many thanks
    FS

    I think trying to get feedback to work with Survey Questions using Advanced Actions would be a lot more trouble than it's worth. 
    Also, using Pretest questions is not the answer because they're very limiting in other ways. They MUST all be at the beginning of the project BEFORE the main quiz, and they disable the playbar for the entire module. (This is by design.)  The way Pretests currently work is not usually what authors are looking for.
    If setting ordinary quiz question slides to zero points and deselecting the option to Include in Quiz is NOT enough for you, then I suggest you resort to creating your own 'question' slides via other methods such as building them from default interactive objects available in Captivate.  You can use widgets or buttons or Smart Shapes or Text Entry boxes to replicate most of the other question types.
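If you go the custom-slide route suggested above, the grading logic ends up in your own Advanced Actions or JavaScript rather than in Captivate's quiz engine. A minimal sketch of grading one custom multiple-choice "question" built from buttons - every name here is hypothetical, and nothing is reported to the LMS, which is exactly the point of custom knowledge-check slides:

```javascript
// Grade one custom question: compare the learner's choice with the
// correct answer and update a running knowledge-check tally kept
// outside the quiz scope. Returns a new state object.
function gradeCustomQuestion(state, chosen, correct) {
  var isCorrect = chosen === correct;
  return {
    score: state.score + (isCorrect ? 1 : 0), // running custom score
    answered: state.answered + 1,             // questions attempted
    lastCorrect: isCorrect                    // drives the feedback shown
  };
}
```

The `lastCorrect` flag is what you would branch on to show a correct/incorrect feedback shape on the slide.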

  • Way to Capture Pre-Test Score?

    Hi all -
    We're using OLM 11.5.10j and want to have a pre-test that, if passed, exempts you from needing to complete the course with which it's associated, and would also mark the overall course as complete. If the learner doesn't pass the pre-test, they would have to take the course + post-test. We also want to capture the pre-test and post-test score separately, so that we know what the score on the pre-test was and, if applicable, the score on the post-test.
    We thought about using two separate "courses" - pre-test, and course+post-test - as part of a Learning Path. That way, we could set the Learning Path to have "at least one component mandatory" and set the pre-test as a mandatory pre-req on the course+post-test. However, that's a little clunky and really isn't a Learning Path.
    We've looked into other AICC-spec fields that are available in OLM to which the learning object would post additional values to show pre-test/post-test score, then report on those spare fields to determine the respective scores. However, we ran into issues understanding what fields were available in OLM for the LO to post to.
    Any suggestions on how to implement this? Thanks!

    Hello Lee
    You can use DFF attributes in OTA_DELEGATE_BOOKINGS table to store Pre-Test Score and Post-Test Score. Build a class just like any other class......
    Class has 3 Main sections
    1. Pre-Test Section
    2. Class
    3. Test
    If the employee passes the Pre-Test section, the employee can either continue to take the class, or you can give the employee credit for the entire class. (Storing the pre-test score in one of the attributes is similar to the way we change the date_status_changed field in the OTA_DELEGATE_BOOKINGS table.)
    If the employee doesn't clear the pre-test section, then save the pre-test score using a package/PLSQL API to store it in the DFF attributes of the OTA_DELEGATE_BOOKINGS table and continue; the rest is the normal process, as in any other class.
    Hope this helps.
    I am also expecting someone to offer a nicer solution.
    Thanks,
    Srujan

  • Pre-Test in Captivate with rating scale questions

    What I want to do with Captivate:
    - First, a pre-test with 35 questions the learner has to answer, using a scale from 1 to 6.
    - These questions belong to 13 groups of questions.
    - The results of the questions are to be summed up in grouping variables.
    - The scores in the variables determine which of the eLearning modules should be highlighted in the TOC.
    - At the end, the TOC should be displayed.
    - I want to use a separate project for each of the following modules (to reduce the size of every project).
    - I have to import all of this into the LMS ILIAS (SCORM 1.2 and SCORM 2004).
    - The learner should be able to interrupt the application at any time and resume again.
    I have seen the thread "how to make a Pre Test in Captivate 4" and worked through the script example in the Captivate Help and the ebook from Rick Stone.
    What is the recommendation:
    1. to realize the slides with the questions (they should be self-created, but how?)
    2. to realize the code in the advanced actions
    3. to realize the interplay between the pre-test project, the TOC, and all the following modules
    Perhaps, someone has an example for me.
    Best regards
    Winfried

    Hello Winfried,
    Spent a couple of hours yesterday working out a proposition. I must confess that I realized it in CP5, mostly to save some time with the Advanced Actions, but it could be done in CP4 too. Beware: I did not bother much about the layout and graphics; I just tried to find a workflow that achieves your goal using Captivate alone (no widgets, no custom-written Flash SWFs) within a reasonable amount of time. There is quite a lot of copy/paste possible - repetitive tasks that are a bit monotonous, sorry about that - and some discipline is necessary.
    To start with: I made some choices to simplify.
    You showed me a screenshot with buttons to click on (a little like the Likert question slide), but I have preferred to have the user enter a number from 1 to 6, which means you'll need a legend on each page to explain this. If you really want the six buttons later on, you should look for a widget or a Flash SWF that returns a value = clicked button (1 to 6). To achieve this in Cp alone would be a tremendous amount of work: you would have to take into account that a user can change his mind and click another button... With my system (typing a number) this is no issue.
    I chose to put 13 questions on each slide, one for each question group. This makes it possible to prepare one slide and copy it twice (giving 39 possible questions). Some minor changes will be needed on the copies, but most of the work can be reused. If you prefer fewer questions per slide, that would also be possible, but then you'll have different copies (e.g. a slide with 7 and a slide with 6 questions).
    I chose to have a Submit button for each question that triggers an advanced action. I mostly need this to check, after completion of a slide, whether all questions have been answered. If you do not like all those 'Submit' buttons, in CP5 this could easily be replaced; in CP4 it will take some more working time.
    First screenshot of a question slide:
    In this screenshot you'll see 13 questions with 13 Text Entry Boxes. For the TEB I provided the Submit button and only the Failure Caption (red one: put all of them in the same spot). The value entered in the TEB is stored in a user variable, so I needed 13 user variables for this slide, labeled them:
    v_1  v_2   .... v_13.
    Same user variables will be used on the next question slide (if you have less questions/slide you'll need less user vars of this kind).
    TEB is validated for the values 1 up till 6. If another number is entered the Failure Caption shows up.
    The blue Text Caption (Please, answer all...) is initially hidden when entering the slide (On Enter slide action).
    Each Submit button triggers an Advanced Action on success (labeled AddGroup1 for the first; each is slightly different, so you'll need 13 of them, but copying limits the work), with variables:
    created one variable 'v_counter' to store the number of correct answers on the slide, will be used to check if all questions have been answered; this counter is set to the value 0 on entering the slide
    created 13 group variables  v_G1  v_G2.... v_G13 to store the total score per group
    labeled actions like AddGroup1:
    increments counter with 1
    adds value stored in v_1 to the variable v_G1
    because playhead shouldn't move on the Timeline, set the system var rdcmndGotoFrame = rdInfoCurrentFrame
    in the 12 copies of this action, only second action has to be changed
    Next button allows the user to go to the next slide, but triggers another advanced action, that I labeled TestCompleteSlide1. You'll need a slightly different action for each Question slide, but again copy does most of the work. This action:
    checks if the variable counter is different from 13
    if it is different:
    the blue Text Caption is showed to insist on answering all the questions (this is what had to be changed on copy, because each text caption has its proper ID, blue Text Caption on second question slide has a different ID from that on first)
    rdcmndGotoFrame = rdInfoCurrentFrame - 2 to put back the playhead
    if it is 13, jump to next slide
    Last slide is the TOC: the modules to be taken are marked with a green circle in this screenshot and get a click box over the name and the green circle. Those not to be taken are marked with a red triangle and have no click box. You'll have to configure the click boxes to open the proper module. Initially all click boxes are there but invisible, as are all green circles and red triangles. They become visible based on the values of the grouping variables v_G1  v_G2.... v_G13. Just a screenshot here; if you confirm that you want to try my proposition, I will explain further. This was rather easy in CP5, and will not be in CP4, which is why I didn't yet work it out in CP4:
    Lilybiri
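The AddGroup and TestCompleteSlide actions above boil down to a few lines of logic. A sketch, with variable names following the post (the playhead manipulation via rdcmndGotoFrame is only indicated in comments, since it only makes sense inside Captivate):

```javascript
// Sketch of Lilybiri's two actions for one slide with 13 questions.
// state holds v_counter and the group totals v_G1..v_G13 (an array here).

// AddGroup<n>: runs when question n's Submit succeeds.
function addGroup(state, n, enteredValue) {
  state.counter += 1;                  // one more question answered
  state.groups[n - 1] += enteredValue; // add the entered value to v_G<n>
  // In Captivate: rdcmndGotoFrame = rdinfoCurrentFrame (hold the playhead)
  return state;
}

// TestCompleteSlide<x>: runs on the Next button.
function testCompleteSlide(state, questionsOnSlide) {
  if (state.counter !== questionsOnSlide) {
    // Show the blue "Please, answer all..." caption and rewind:
    // rdcmndGotoFrame = rdinfoCurrentFrame - 2
    return "stay";
  }
  return "nextSlide"; // all questions answered: jump to next slide
}
```

On entering each question slide, `counter` would be reset to 0, exactly as the post describes.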

  • How to Add Likert Scale Pre-Test w/Customized Feedback?

    I need to have two quizzes in my course:
    One is a Self-Assessment with a 5 point Likert-Scale. There are 6 questions in this set. I need three possible responses depending on how they score (0-4 is one response; 5-8 is another response; 9-12 is another response)
    A final quiz at the end with only 3 questions, again giving feedback based on how they respond, this time percentage-wise (they score less than 80% they get feedback, over 80% another feedback)
    I am a very basic Captivate 6 user and have no clue how to go about doing this. I've read many forums about pre-tests (which don't seem to include Likert Scales), variables, advanced actions. There HAS to be a simpler way...
    Thanks,
    Christina

    Hi Lilybiri,
    Thanks for the quick response. To answer your question...
    I'm using Captivate 6. I saw the pretest option but Likert Scale is not an option
    I can use a different type of question, but wasn't able to figure out how to have different responses score differently. When I look at the Quiz Properties for MCQ, I only see one overall score value. Here is an example of the type of question I'm using: I proactively develop relationships with those I work with. Options are Never = 1 pt; Sometimes = 2 pt; Often = 3 pt; Always = 4 pt; I Don't Know = 0 pt
    If they receive a total of 0-4 pts, their feedback should be: "You can considerably improve your influencing skills by applying the principles in this course"; If they receive a total of 5-8 pts, their feedback should be: "You are already applying some of the principles. Plan to enhance your usage after completing this course." If they receive 9+ pts, the feedback should be: "You are using your influencing skills regularly. Focus on the principles you tend to use less often and coach others on how to become an effective influencer."
    Yes, all I need for the Final quiz is the Pass/Fail. I already have that set up and it's working fine (and currently not including the Likert Scale questions = I figured out how to do that based on this link: http://goo.gl/hhjJe)
    Thanks,
    Christina
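The three feedback bands Christina describes (0-4, 5-8, 9+) reduce to one condition chain, whether built as Advanced Actions or as JavaScript. A sketch using the exact messages from the post:

```javascript
// Map the self-assessment total to one of the three feedback bands
// from the post: 0-4, 5-8, and 9 or more points.
function likertFeedback(total) {
  if (total <= 4) {
    return "You can considerably improve your influencing skills by applying the principles in this course";
  }
  if (total <= 8) {
    return "You are already applying some of the principles. Plan to enhance your usage after completing this course.";
  }
  return "You are using your influencing skills regularly. Focus on the principles you tend to use less often and coach others on how to become an effective influencer.";
}
```

In Captivate this would be a conditional Advanced Action on the slide after the last self-assessment question, showing one of three hidden captions based on a user variable holding the total.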

  • Pre-sales Questions

    What is the best way to contact Apple and get answers to pre-sales questions? Sometimes these forums work well; other times there is no response. When there is a response, is there a way to determine whether the responder is a representative from Apple?
    Is there an 800 number one can call for pre-sales questions? Unfortunately the closest Apple Store is 60 miles from my home.

    Hi jbmth,
    Responses you get here are from other users that volunteer to share their Apple knowledge.
    The quality is probably the best possible, since every time something goes wrong in someone's advice, someone else quickly jumps in and offers corrections/additions.
    Determine representatives from Apple:
    They have a white apple in front of their names. You won't see an answer from an Apple employee more than once or twice a year here, really.
    This is mostly a user-to-user forum.
    What kind of info do you need?
    Generally speaking, speculation about new products is very unwelcome by Apple standards.
    Here in the Discussions, it is simply forbidden by the Terms of Use.
    I hope this helps.
    Axl

  • Pre Tests in Captivate

    In one of the Developer Forum articles it mentions the
    ability to use pre-tests in Captivate.
    There is no documentation in Help or in the Knowledge Base on
    how to assign questions to a pre-test as opposed to a
    post-test, or on how to give full credit for, and the ability
    to bypass, the content when all pre-test answers are correct.
    And if there are any incorrect pre-test answers, to send the
    learner through the content and post-test, but in that case
    only score the user on the answers in the post-test, not all
    questions in the course... Any suggestions? Thanks.

    You may be thinking that pre-testing is an actual feature of
    the product, which is not the case.
    Captivate only knows tests... its up to you to design them to
    be useful as a pre-test or a post-test. How you do this is
    determined by the delivery environment your content will be
    accessed from and an understanding of Captivate's reporting
    abilities.
    For example, you mentioned that you want to "give full
    credit", which implies that you're doing centralized reporting in
    some way. The type of reporting you're doing will determine how you
    create your course.
    If you're using an LMS via the SCORM standard, all navigation
    and pass/fail logic must reside in the LMS, so you just build your
    pre-test and make the LMS use the score from that to determine whether
    the user passes the overall course. If they don't pass, make them
    take everything, if they do pass, give them credit for the whole
    thing.
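The LMS-side pass/fail decision described here can be pictured like this - a sketch of SCORM 1.2 behaviour rather than Captivate code, and hedged accordingly: real LMSes genuinely differ in whether they honour `cmi.student_data.mastery_score` at all.

```javascript
// Sketch: how a SCORM 1.2 LMS might decide pass/fail from the score
// the SCO reported, compared against an imported mastery score.
function lmsPassFail(scoreRaw, masteryScore) {
  if (typeof masteryScore !== "number") {
    return "unknown"; // no mastery score: the status is up to the SCO
  }
  return scoreRaw >= masteryScore ? "passed" : "failed";
}
```

This is exactly the lever behind the "assign higher point values to pre-test questions" trick mentioned below: inflating the reported score pushes the learner over the mastery threshold.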
    If your particular LMS doesn't let you do that and you can't
    find a way to trick the LMS into doing what you want (say, by
    assigning much higher score values to the questions in the pre-test
    to artificially give the user a score high enough to pass the
    entire course), then you may be out of luck. That's the nature of
    SCORM, and has nothing to do with Captivate.
    Bottom line, there's no documentation because Captivate is
    not intended to fit into any particular system or scheme. It's
    merely a tool that lets you create building blocks that you can
    (hopefully) slot into a separate solution using a "standards-based"
    interface, such as AICC/SCORM, or one of the other options listed
    in the Quiz Manager.
    The reality is that each deployment environment, especially
    the "standards-based" ones, has its own quirks and deployment
    requirements, so how you make Captivate work in your
    particular environment is unfortunately left up to you.
    That's just the nature of the game.

  • How can I skip/pass a lesson when the pre-test is passed?

    I'm creating a course with an overview and 3 lessons. Each is its own CP file (Captivate 6).
    This course will be hosted in our LMS, where I will require each lesson to be taken/passed.
    Lesson 1 is introductory/background material about my topic, and my users may or may not have had exposure to this information already. If they know the info, there's no need for them to take this lesson.
    I have added a pre-test to the lesson, that, if failed, forces the user to go through the lesson content and a graded quiz at the end. The score is reported to the LMS and all is well.
    If the user passes the pre-test, s/he is jumped to a congrats slide and instructed to exit the lesson. (No need to take the graded quiz - the questions are the same, and they've already "passed.")
    My question: How do I report a PASS for the lesson if the user doesn't take the graded quiz? I don't need to report any particular score.
    Thanks for any help you can offer!

    Lieve, I'd LOVE to see your blog post about this topic - especially since you always include examples. =)
    Request: I need help with reporting.
    Here's my project's Branch View (click to view larger):
    I've done as you suggested as far as I know. The navigation works perfectly, but when it comes to reporting, I'm not seeing what I expect/want.
    What I want:
    For the user to be able to skip the pre-test (the branching image shows this). If the user passes the pre-test, s/he can exit the course with a Complete/Pass.
    For the user to have X number of chances at the test. After the user a) passes or b) uses up X chances and still fails, I want to report back to my LMS (I'm testing in scormcloud right now, but will be using HealthStream) both completion AND pass or fail.
    What I've got:
    The user CAN skip the pre-test (the branching works, anyway)
    I can get Captivate to report Complete/Unknown or Complete/Pass (sometimes, even when the user fails the graded quiz), or Incomplete/Unknown.
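For the original question here (reporting a PASS when the graded quiz is skipped), one hedged option is an Execute JavaScript action on the congrats slide that sets the status directly through the SCORM 1.2 API. This is a sketch, not Captivate's own mechanism: the API lookup is omitted, and whether your LMS accepts a status without a matching score varies (HealthStream and SCORM Cloud may behave differently):

```javascript
// Sketch: mark the lesson passed via SCORM 1.2, e.g. from an
// Execute JavaScript action on the pretest "congrats" slide.
// scormAPI is the LMS-provided SCORM 1.2 API object.
function reportPass(scormAPI, scoreRaw) {
  if (!scormAPI) return false; // no LMS connection found
  scormAPI.LMSSetValue("cmi.core.lesson_status", "passed");
  if (typeof scoreRaw === "number") {
    scormAPI.LMSSetValue("cmi.core.score.raw", String(scoreRaw));
  }
  return scormAPI.LMSCommit("") === "true";
}
```

Passing a nominal score (e.g. the quiz's passing score) alongside the status tends to keep LMS reports consistent, but test against your target LMS first.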

  • Pre test not branching

    I followed the Adobe eLearning channel video and set the pre-test action to ......PointsScored is greater than 40 (as there are only 4 questions worth 10 points each). The action is to jump to the advanced section, or else jump to the beginner section. Yet on taking the pre-test, pass or fail, I arrive at the beginner section. What am I missing?
    Thanks Jo

    Hi Joanne,
    Thanks for reaching out to the Adobe Community. Please accept my apologies for the inconvenience.
    I tested the same thing at my end and it works. The reason it is not working at your end is that the Else statement in your Advanced Action is executing. You have only 4 questions worth 10 points each, and your If statement says the score should be greater than 40, which can never happen. For this to work you need a minimum of 5 questions, or you should set the statement to "is equal to 40". In addition, please make sure you are using the variable cpQuizInfoPretestPointsScored, since you have a pretest quiz.
    Hope this helps!
    Thanks!
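The corrected condition can be sketched as a single function (`cpQuizInfoPretestPointsScored` is the pretest score variable named in the reply above):

```javascript
// Sketch of the pretest branching decision: 4 questions x 10 points,
// so a perfect pretest is exactly 40 points.
function pretestBranch(pretestPointsScored) {
  // Use "is equal to 40" (or >= 40) instead of "is greater than 40",
  // which can never be true with a 40-point maximum.
  return pretestPointsScored >= 40 ? "advanced" : "beginner";
}
```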

  • Test questions for certification on SBO 2007

    Hi all,
    I am an SBO consultant and have about 2 years' experience. I am currently studying the e-learning curriculum with a view to getting certified as an SBO 2007 implementation consultant. There are some self-test questions available with the curriculum, but it would be better if there were more.
    Any info on how to get more test exam questions would be greatly appreciated.
    best regards,
    michael

    Hi,
    Please go through the first and third books (i.e. Logistics & Implementation). The reason I say this is that they contain some direct questions; in the accounting book there won't be direct questions - all the questions are based on business scenarios.
    But my advice is to learn and have knowledge of all the books. Do the test well; my best wishes for success in your online examination.
    Regards
    Chidambaram

  • Random test questions quiz results not working

    I have a project with random test questions, published in HTML5. Each time I play the project in preview, the test results are accurate and correct at the end. However, after I have published the project to my site, the random test questions are NOT accurately scored at the end of the quiz. It basically tells me I got all the questions wrong, when in fact the display on each question clearly states it's correct.
    If I move a question out of the random test questions and back into the main project, it is scored correctly, but once it's back in the "pools" area, it is scored incorrectly in the final tally as a random test question. I don't have a clue why this is; it's really been time-consuming trying to fix it, and my luck so far fixing anything to do with Captivate is really low! Buggy! I'm using Captivate 8, so have no idea...
    Anyone have any idea what is going on?

    So obvious I don't know why I didn't think of that. LOL
    Many thanks that works.
    It would appear that the only way to adjust the pause is to add time to the slide, as I am unable to slide the pause back. But changing the slide from 3 secs to 3.1 does work.
    Is there an easy way to do this for ALL slides, or do I just have to do it one by one?
    Again many thanks
    Andy
    Jeddah by the sea, Saudi Arabia.

  • VSTS load test QUESTION: While recording the webPerformanceTest, is it possible to clear / force-clear the cache without stopping or closing IE on the recorder?

    I am facing an issue while recording the webtest: the first time our application loads any screen it calls some requests, but the second time the same information is browsed, if it is available in the cache, it will not call the requests again.
    How can I force VSTS to clear the cache while recording, so that I can record those requests which get called the first time?
    Regards
    SatishK

    Hi SatishK,
    Thanks for your reply.
    >>but my question was while recording (during recording using the Web Test Recorder) I want to clear the cache.
    Based on your issue, I did some research about it, but I still could not find official documentation about it. Generally, I know that caching for a web performance test is controlled via the Cache Control properties, so I think it is not possible to clear the caches while recording the web performance test.
    If you still want this feature for web performance tests, I suggest you submit a feature request:
    http://visualstudio.uservoice.com/forums/121579-visual-studio. The Visual Studio product team is listening to user voice there. You can send your idea there and people can vote. If you submit this suggestion, I hope you will post the link here, and I will help you vote for it.
    Thanks for your understanding.
    Best Regards,
